Approximating Activation Functions

01/17/2020
by Nicholas Gerard Timmons, et al.

ReLU is widely seen as the default choice of activation function in neural networks. However, there are cases where more complicated functions are required. In particular, recurrent neural networks (such as LSTMs) make extensive use of both the hyperbolic tangent and sigmoid functions, which are expensive to compute. We used function approximation techniques to develop replacements for these functions and evaluated them empirically on three popular network configurations. We find safe approximations that yield a 10%-37% performance improvement, are suitable for all cases we considered, and, we believe, are appropriate replacements for all networks using these activation functions. We also develop ranged approximations, which only apply in some cases due to restrictions on their input domain. Our ranged approximations yield a performance improvement of 20% and considerably outperform the ad-hoc approximations used in Theano and in the implementation of Word2Vec.
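To illustrate the idea of a ranged approximation, here is a minimal sketch in Python (not taken from the paper): the exact sigmoid is replaced by interpolation over a precomputed table that is only valid on a restricted input domain. The function names, the table size, and the clamping range [-6, 6] are assumptions chosen for illustration and may differ from the paper's actual approximation schemes.

```python
import numpy as np

def sigmoid_exact(x):
    """Reference sigmoid using the (comparatively expensive) exponential."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_ranged_approx(x, lo=-6.0, hi=6.0, table_size=256):
    """Hypothetical ranged approximation via table lookup and linear interpolation.

    Only valid when the network's inputs to the activation are known to lie
    (mostly) within [lo, hi]; values outside the range are clamped.
    """
    # Precompute the sigmoid on a grid over the restricted domain.
    table_x = np.linspace(lo, hi, table_size)
    table_y = 1.0 / (1.0 + np.exp(-table_x))
    # Clamp out-of-range inputs, then interpolate between table entries.
    return np.interp(np.clip(x, lo, hi), table_x, table_y)

if __name__ == "__main__":
    x = np.linspace(-8.0, 8.0, 5)
    print(sigmoid_exact(x))
    print(sigmoid_ranged_approx(x))
```

In a real implementation the table would be built once and reused (or replaced by a low-degree polynomial), so the per-call cost is a clamp, an index computation, and one linear interpolation rather than an exponential.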
