Deep Expander Networks: Efficient Deep Networks from Graph Theory

11/23/2017
by Ameya Prabhu, et al.

Deep Neural Networks, while being unreasonably effective for several vision tasks, have their usage limited by the computational and memory requirements, both during training and inference stages. Analyzing and improving the connectivity patterns between layers of a network has resulted in several compact architectures like GoogLeNet, ResNet and DenseNet-BC. In this work, we utilize results from graph theory to develop an efficient connection pattern between consecutive layers. Specifically, we use expander graphs that have excellent connectivity properties to develop a sparse network architecture, the deep expander network (X-Net). The X-Nets are shown to have high connectivity for a given level of sparsity. We also develop highly efficient training and inference algorithms for such networks. Experimental results show that we can achieve similar or better accuracy than DenseNet-BC with two-thirds the number of parameters and FLOPs on several image classification benchmarks. We hope that this work motivates other approaches to utilize results from graph theory to develop efficient network architectures.
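To make the idea concrete, below is a minimal sketch (not the authors' released code) of an expander-style sparse layer: each output unit connects to a fixed, small number of randomly chosen input units, approximating the random bipartite expander connectivity the abstract describes. The class name, degree parameter, and initialization are illustrative assumptions, written against PyTorch.

```python
# Hypothetical sketch of an expander-graph-inspired sparse linear layer.
# Assumes PyTorch; names (ExpanderLinear, degree) are illustrative, not from the paper.
import torch
import torch.nn as nn


class ExpanderLinear(nn.Module):
    def __init__(self, in_features, out_features, degree):
        super().__init__()
        assert degree <= in_features
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_features))
        # Fixed 0/1 mask: every output unit picks `degree` distinct inputs
        # uniformly at random, giving a sparse bipartite connection pattern.
        mask = torch.zeros(out_features, in_features)
        for row in range(out_features):
            idx = torch.randperm(in_features)[:degree]
            mask[row, idx] = 1.0
        self.register_buffer("mask", mask)

    def forward(self, x):
        # Only the retained (masked-in) connections contribute to the output.
        return nn.functional.linear(x, self.weight * self.mask, self.bias)


if __name__ == "__main__":
    layer = ExpanderLinear(in_features=512, out_features=256, degree=32)
    out = layer(torch.randn(8, 512))
    print(out.shape)              # torch.Size([8, 256])
    print(layer.mask.mean().item())  # fraction of connections kept: 32/512 = 0.0625
```

In this sketch the sparsity level is set directly by `degree`; the dense weight tensor is kept for simplicity, whereas an efficient implementation would store and multiply only the retained connections, which is where the parameter and FLOP savings reported in the abstract come from.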
