Conjugate-gradient-based Adam for stochastic optimization and its application to deep learning

02/29/2020
by Yu Kobayashi, et al.

This paper proposes a conjugate-gradient-based Adam algorithm that blends Adam with nonlinear conjugate gradient methods and presents its convergence analysis. Numerical experiments on text classification and image classification show that the proposed algorithm can train deep neural network models in fewer epochs than existing adaptive stochastic optimization algorithms.
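The following is a minimal illustrative sketch (not the paper's exact algorithm) of how a nonlinear conjugate-gradient direction can be fed into Adam-style moment estimates: the raw gradient is replaced by a direction d_t = g_t + gamma_t * d_{t-1}, where gamma_t here is the Fletcher-Reeves coefficient chosen purely as an example. All names and hyperparameters below are assumptions for illustration.

```python
import numpy as np

def cg_adam_sketch(grad_fn, x0, lr=0.05, beta1=0.9, beta2=0.999,
                   eps=1e-8, n_steps=2000):
    """Illustrative sketch: Adam whose first/second moments are driven by a
    nonlinear conjugate-gradient direction (Fletcher-Reeves coefficient used
    here as an example choice) instead of the raw gradient. This is a
    hypothetical simplification, not the paper's exact method."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)   # first-moment estimate
    v = np.zeros_like(x)   # second-moment estimate
    d = np.zeros_like(x)   # previous conjugate direction
    g_prev = None
    for t in range(1, n_steps + 1):
        g = grad_fn(x)
        # Fletcher-Reeves coefficient (example choice): ||g_t||^2 / ||g_{t-1}||^2
        gamma = 0.0 if g_prev is None else g.dot(g) / (g_prev.dot(g_prev) + eps)
        # Conjugate-gradient-style direction in place of the raw gradient
        d = g + gamma * d
        g_prev = g
        # Standard Adam moment updates, but driven by d rather than g
        m = beta1 * m + (1 - beta1) * d
        v = beta2 * v + (1 - beta2) * (d * d)
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Usage: minimize the quadratic f(x) = ||x||^2 / 2, whose gradient is x.
x_min = cg_adam_sketch(lambda x: x, x0=np.ones(5))
print(x_min)  # should be close to the zero vector
```

In this toy setting the conjugate-direction term simply accumulates past gradient information before the Adam moments are formed; the paper's contribution lies in the specific coefficient choices and the accompanying convergence analysis.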
