Smooth Inter-layer Propagation of Stabilized Neural Networks for Classification
Recent work has studied the reasons behind the remarkable performance of deep neural networks in image classification. We examine batch normalization on the one hand and the dynamical systems view of residual networks on the other. Our goal is to understand the notions of stability and smoothness of convergence in ResNets, so as to explain when they contribute to significantly enhanced performance. We postulate that convergence stability is important for a trained ResNet to transfer.
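The dynamical systems view referenced above interprets a residual block update as one forward-Euler step of an ordinary differential equation. A minimal sketch of this interpretation, assuming a hypothetical `tanh` residual branch and a step size `h` (both illustrative choices, not the paper's architecture):

```python
import numpy as np

def residual_branch(x, W):
    # Hypothetical nonlinear residual branch f(x) = tanh(W x); illustrative only.
    return np.tanh(W @ x)

def resnet_forward(x, weights, h=0.1):
    # Dynamical systems view: each residual block computes
    #   x_{k+1} = x_k + h * f(x_k),
    # i.e. one forward-Euler step of the ODE dx/dt = f(x).
    # A small step size h keeps the layer-to-layer propagation smooth.
    for W in weights:
        x = x + h * residual_branch(x, W)
    return x

rng = np.random.default_rng(0)
weights = [rng.normal(scale=0.5, size=(4, 4)) for _ in range(10)]
x0 = rng.normal(size=4)
x_out = resnet_forward(x0, weights)
```

Under this view, stability of the underlying ODE discretization is what constrains how activations can grow or oscillate across layers.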