Exponential and logarithm optimisation resources

Mamy Ratsimbazafy edited this page Dec 5, 2018 · 2 revisions

Optimising exponential and logarithm functions is critical for machine learning, as many activation and loss functions rely on them, in particular:

  • Negative log-likelihood and cross-entropy loss
  • sigmoid
  • softmax
  • softmax cross-entropy (using the log-sum-exp techniques)

The default implementations in <math.h> are very slow. The usual way to implement faster versions is polynomial approximation.

Literature

Reference discussions

with SIMD code:

Libraries