Online normalizer calculation for softmax

05/08/2018
by Maxim Milakov, et al.

The Softmax function is ubiquitous in machine learning, and multiple previous works have suggested faster alternatives for it. In this paper we propose a way to compute the classical Softmax with fewer memory accesses and hypothesize that this reduction in memory accesses should improve Softmax performance on actual hardware. The benchmarks confirm this hypothesis: Softmax accelerates by up to 1.3x and Softmax+TopK combined by up to 5x.
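
The core idea is to fuse the maximum and the normalizer (the sum of exponentials) into a single pass over the input, rescaling the partial sum whenever a new maximum is found, instead of one pass for the maximum and a second for the sum. The sketch below is an illustrative NumPy rendering of that online normalizer calculation, not the authors' optimized GPU implementation; the function name online_softmax is mine.

```python
import numpy as np

def online_softmax(x):
    """Numerically safe softmax whose max and normalizer are
    computed together in a single pass over the input."""
    m = -np.inf   # running maximum seen so far
    d = 0.0       # running normalizer: sum of exp(x_j - m)
    for v in x:
        m_new = max(m, v)
        # rescale the existing partial sum to the new maximum,
        # then add the contribution of the current element
        d = d * np.exp(m - m_new) + np.exp(v - m_new)
        m = m_new
    # a final pass only to write the outputs
    return np.exp(x - m) / d

# Example: matches the usual two-pass softmax
print(online_softmax(np.array([1.0, 2.0, 3.0])))
```

Because the running sum is corrected by exp(m - m_new) whenever the maximum grows, the result is identical to the classical "subtract the max, then exponentiate and normalize" formulation, while the input needs to be read one fewer time.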
