Initializing ReLU networks in an expressive subspace of weights

03/23/2021
by Dayal Singh, et al.

Using a mean-field theory of signal propagation, we analyze the evolution of correlations between two signals propagating through a ReLU network with correlated weights. Signals become highly correlated in deep ReLU networks with uncorrelated weights. We show that ReLU networks with anti-correlated weights can avoid this fate and instead exhibit a chaotic phase in which the correlations saturate below unity. Consistent with this analysis, we find that networks initialized with anti-correlated weights can train faster (in a teacher-student setting) by exploiting the increased expressivity of the chaotic phase. Combining this with a previously proposed strategy of using an asymmetric initialization to reduce the probability of dead ReLUs, we propose an initialization scheme that enables faster training and learning than the best-known methods.
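As a rough illustration of the idea, the sketch below initializes a layer with anti-correlated weights and an asymmetric (positive) bias shift. This is a hypothetical construction, not the paper's exact scheme: the anti-correlation here is induced by centering each He-scaled Gaussian row (giving every pair of entries in a row a correlation of -1/(fan_in - 1)), and the bias constant 0.1 is an assumed value chosen only to show how a positive shift lowers the chance of dead ReLUs at initialization.

```python
import numpy as np

def anticorrelated_weights(fan_in, fan_out, rng=None):
    """He-scaled Gaussian weights with negative pairwise correlation.

    Hypothetical illustration: subtracting the row mean from i.i.d.
    Gaussian entries makes each row sum to zero, which induces a
    pairwise correlation of -1/(fan_in - 1) within every row.
    """
    rng = np.random.default_rng() if rng is None else rng
    w = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_out, fan_in))
    w -= w.mean(axis=1, keepdims=True)  # induce anti-correlation
    return w

def asymmetric_bias(fan_out, shift=0.1):
    """Small positive bias shift (assumed value) so fewer units start
    with pre-activations entirely below zero, i.e. fewer dead ReLUs."""
    return np.full(fan_out, shift)

# Usage: one hidden layer of a ReLU network.
w1 = anticorrelated_weights(fan_in=256, fan_out=256)
b1 = asymmetric_bias(fan_out=256)
x = np.random.default_rng(0).normal(size=256)
h = np.maximum(0.0, w1 @ x + b1)  # ReLU pre-activation
```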
