TanhSoft – a family of activation functions combining Tanh and Softplus

09/08/2020
by Koushik Biswas, et al.

Deep learning, at its core, contains functions that are compositions of a linear transformation with a non-linear function known as an activation function. In the past few years, there has been increasing interest in constructing novel activation functions that result in better learning. In this work, we propose a family of novel activation functions, namely TanhSoft, with four undetermined hyper-parameters, of the form tanh(αx + βe^(γx))·ln(δ + e^x), and tune these hyper-parameters to obtain activation functions that are shown to outperform several well-known activation functions. For instance, replacing ReLU with x·tanh(0.6e^x) improves top-1 classification accuracy on CIFAR-10 by 0.46% for DenseNet-169 and 0.7% for Inception-v3, while with tanh(0.87x)·ln(1 + e^x), top-1 classification accuracy on CIFAR-100 improves by 1.24% for DenseNet-169 and 2.57% for SimpleNet.
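For concreteness, here is a minimal PyTorch sketch of the TanhSoft family and the two instances named in the abstract. The function names (tanhsoft, tanhsoft_1, tanhsoft_2) are ours for illustration and are not taken from the paper's code; note that x·tanh(0.6e^x) is the family member with α=0, β=0.6, γ=1, δ=0 (since ln(e^x) = x), and tanh(0.87x)·ln(1 + e^x) is the member with α=0.87, β=0, δ=1.

```python
import torch
import torch.nn.functional as F

def tanhsoft(x, alpha, beta, gamma, delta):
    # General TanhSoft family: tanh(alpha*x + beta*e^(gamma*x)) * ln(delta + e^x).
    # A direct transcription of the formula; not numerically hardened
    # (torch.exp(x) overflows for large x).
    return torch.tanh(alpha * x + beta * torch.exp(gamma * x)) * torch.log(delta + torch.exp(x))

def tanhsoft_1(x):
    # Instance from the abstract: x * tanh(0.6 * e^x)
    # (alpha=0, beta=0.6, gamma=1, delta=0, since ln(0 + e^x) = x).
    return x * torch.tanh(0.6 * torch.exp(x))

def tanhsoft_2(x):
    # Instance from the abstract: tanh(0.87x) * ln(1 + e^x)
    # (alpha=0.87, beta=0, delta=1; ln(1 + e^x) is softplus,
    # used here for numerical stability).
    return torch.tanh(0.87 * x) * F.softplus(x)

# Quick sanity check: both instances behave like smooth, ReLU-like curves.
x = torch.linspace(-3.0, 3.0, 7)
print(tanhsoft_1(x))
print(tanhsoft_2(x))
```

Either function can be dropped in wherever a ReLU would normally be applied, e.g. as the activation inside a torch.nn.Module's forward pass.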
