Flexible Rectified Linear Units for Improving Convolutional Neural Networks

06/25/2017
by Suo Qiu, et al.

The rectified linear unit (ReLU) is a widely used activation function for deep convolutional neural networks. In this paper, we propose a novel activation function called the flexible rectified linear unit (FReLU). FReLU improves on ReLU by making the rectified point learnable, which yields faster convergence and higher performance. Furthermore, because it adapts during training, FReLU does not rely on strict assumptions. FReLU is also simple and effective, as it does not use an exponential function. We evaluate FReLU on two standard image classification datasets, CIFAR-10 and CIFAR-100. Experimental results show the strengths of the proposed method.
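As a rough illustration of an activation with a learnable rectified point, the sketch below computes ReLU(x) plus a learnable shift. The per-channel parameterization, zero initialization, and the FReLU class name are assumptions made for this example, not details taken from the abstract.

import torch
import torch.nn as nn


class FReLU(nn.Module):
    """Sketch of a flexible ReLU: ReLU(x) plus a learnable shift.

    Assumption: one shift parameter per channel, initialized to zero.
    """

    def __init__(self, num_channels: int):
        super().__init__()
        # Learnable shift that moves the rectified point of the activation.
        self.bias = nn.Parameter(torch.zeros(num_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Broadcast the per-channel shift over (N, C, H, W) feature maps.
        return torch.relu(x) + self.bias.view(1, -1, 1, 1)


# Usage: drop FReLU in place of nn.ReLU inside a small convolutional block.
block = nn.Sequential(nn.Conv2d(3, 16, kernel_size=3, padding=1), FReLU(16))
out = block(torch.randn(2, 3, 32, 32))  # shape: (2, 16, 32, 32)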

