research ∙ 11/08/2021
SMU: smooth activation function for deep networks using smoothing maximum technique
Deep learning researchers have a keen interest in proposing two new nove...
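The "smoothing maximum technique" in the title refers to replacing the kink of a maximum with a smooth surrogate. Below is a minimal NumPy sketch, assuming the identity max(a, b) = ((a + b) + |a - b|) / 2 together with the erf-based approximation |z| ≈ z·erf(μz); alpha (the leaky slope) and mu (the smoothing strength) are illustrative defaults standing in for what would be trainable parameters.

```python
import numpy as np
from scipy.special import erf

def smu(x, alpha=0.25, mu=1.0):
    """Smooth surrogate of Leaky ReLU, max(x, alpha * x), via a smoothed maximum.

    Uses max(a, b) = ((a + b) + |a - b|) / 2 with |z| approximated by
    z * erf(mu * z). As mu grows, the function approaches the exact
    maximum; alpha and mu would be trainable in a network.
    """
    return ((1 + alpha) * x + (1 - alpha) * x * erf(mu * (1 - alpha) * x)) / 2
```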
research ∙ 09/27/2021
SAU: Smooth activation function using convolution with approximate identities
Well-known activation functions like ReLU or Leaky ReLU are non-differen...
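Convolving a non-differentiable activation with an approximate identity (a mollifier) yields a smooth function that recovers the original as the mollifier narrows to a Dirac delta. The sketch below assumes a Gaussian mollifier, under which smoothed Leaky ReLU has a closed form; the paper's particular approximate identity and parameterization may differ.

```python
import numpy as np
from scipy.stats import norm

def gaussian_smoothed_leaky_relu(x, alpha=0.01, sigma=1.0):
    """Leaky ReLU convolved with a Gaussian approximate identity N(0, sigma^2).

    Writing LeakyReLU(x) = alpha * x + (1 - alpha) * ReLU(x) and using the
    closed form (ReLU * N(0, sigma^2))(x) = x * Phi(x/sigma) + sigma * phi(x/sigma),
    with Phi/phi the standard normal CDF/PDF. As sigma -> 0 this converges
    pointwise back to the non-smooth Leaky ReLU.
    """
    z = x / sigma
    return alpha * x + (1 - alpha) * (x * norm.cdf(z) + sigma * norm.pdf(z))
```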
research ∙ 09/09/2021
ErfAct and PSerf: Non-monotonic smooth trainable Activation Functions
An activation function is a crucial component of a neural network that i...
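For a flavor of what "non-monotonic smooth trainable" means here, the sketch below composes erf with softplus in the style of Serf-type activations; the exact ErfAct and PSerf parameterizations are not reproduced, and gamma and delta are illustrative stand-ins for the trainable parameters.

```python
import numpy as np
from scipy.special import erf

def serf_like(x, gamma=1.0, delta=1.0):
    """Illustrative non-monotonic smooth activation: x * erf(gamma * softplus(delta * x)).

    For negative x the output dips below zero and then decays back to 0,
    giving the non-monotonic shape; gamma and delta would be trainable.
    np.logaddexp(0, t) computes softplus ln(1 + e^t) stably.
    """
    return x * erf(gamma * np.logaddexp(0.0, delta * x))
```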
research ∙ 06/17/2021
Orthogonal-Padé Activation Functions: Trainable Activation functions for smooth and faster convergence in deep networks
We have proposed orthogonal-Padé activation functions, which are trainab...
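A Padé-style activation is a trainable rational function P(x)/Q(x); the orthogonal variant expands P and Q in an orthogonal polynomial basis rather than raw monomials. A hedged sketch with a Chebyshev basis and a pole-free 1 + |Q| denominator (the usual safe form for Padé activation units); the coefficients shown are hypothetical, not the paper's.

```python
import numpy as np
from numpy.polynomial import chebyshev as cheb

def orthogonal_pade(x, a, b):
    """Rational activation f(x) = P(x) / (1 + |Q(x)|) over a Chebyshev basis.

    P and Q are linear combinations of Chebyshev polynomials T_k with
    coefficient vectors a and b, which would be learned by backprop.
    The 1 + |Q| denominator keeps f free of poles for any coefficients.
    """
    return cheb.chebval(x, a) / (1.0 + np.abs(cheb.chebval(x, b)))

# Hypothetical coefficients, for illustration only.
a = np.array([0.0, 0.5, 0.3])  # P(x) = 0.5*T1(x) + 0.3*T2(x)
b = np.array([0.0, 0.1])       # Q(x) = 0.1*T1(x)
y = orthogonal_pade(np.linspace(-3.0, 3.0, 7), a, b)
```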
research ∙ 09/28/2020
EIS – a family of activation functions combining Exponential, ISRU, and Softplus
Activation functions play a pivotal role in the function learning using ...
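The family's exact parameterization is not quoted in this listing, but its three named ingredients are standard; the sketch defines them, and eis_like is a purely hypothetical combination shown only to illustrate how such components compose.

```python
import numpy as np

def softplus(x):
    # Softplus: ln(1 + e^x), computed stably via logaddexp.
    return np.logaddexp(0.0, x)

def isru(x, alpha=1.0):
    # Inverse Square Root Unit: x / sqrt(1 + alpha * x^2).
    return x / np.sqrt(1.0 + alpha * x * x)

def eis_like(x, alpha=1.0, beta=1.0):
    # Hypothetical blend, NOT the paper's formula: ISRU saturation,
    # softplus gating, and exponential damping on the negative side.
    return isru(x, alpha) * softplus(x) * np.exp(-beta * np.maximum(-x, 0.0))
```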
research ∙ 09/08/2020