Pruning of Convolutional Neural Networks Using Ising Energy Model
Pruning is one of the major methods for compressing deep neural networks. In this paper, we propose an Ising energy model within an optimization framework for pruning convolutional kernels and hidden units. The model is designed to reduce redundancy between weight kernels and to detect inactive kernels and hidden units. Our experiments with ResNets, AlexNet, and SqueezeNet on the CIFAR-10 and CIFAR-100 datasets show that the proposed method can, on average, prune more than 50% of the trainable parameters with drops of less than 10% in Top-1 and less than 5% in Top-5 classification accuracy, respectively.
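To make the idea concrete, the sketch below shows one way an Ising-style energy over a binary keep/prune mask could combine the two ingredients the abstract names: a pairwise coupling that penalizes keeping redundant (similar) kernels and a unary field that rewards keeping active (high-magnitude) kernels. The similarity and magnitude terms, the weighting `lam`, and the greedy minimization loop are illustrative assumptions, not the paper's exact formulation or optimization procedure.

```python
import torch


def ising_pruning_energy(kernels: torch.Tensor, keep: torch.Tensor, lam: float = 1.0) -> torch.Tensor:
    """Ising/QUBO-style energy over a binary keep mask for conv kernels.

    Hypothetical terms (assumptions, not the paper's formulation):
      - coupling J_ij: absolute cosine similarity between flattened kernels,
        so keeping two redundant kernels raises the energy;
      - field a_i: L1 magnitude of kernel i, so keeping an active kernel
        lowers the energy (weighted by `lam`).
    `kernels` has shape (num_kernels, c_in, k, k); `keep` is a {0, 1} mask.
    """
    flat = kernels.flatten(1)
    unit = flat / (flat.norm(dim=1, keepdim=True) + 1e-8)
    J = (unit @ unit.t()).abs()          # pairwise redundancy couplings
    J.fill_diagonal_(0.0)
    a = flat.abs().sum(dim=1)            # per-kernel activity (L1 norm)
    b = keep.float()
    redundancy = 0.5 * b @ J @ b         # penalize keeping similar kernels
    activity = lam * (a * b).sum()       # reward keeping active kernels
    return redundancy - activity


if __name__ == "__main__":
    # Toy usage: greedily flip whichever mask bit lowers the energy the most.
    torch.manual_seed(0)
    W = torch.randn(16, 3, 3, 3)         # 16 random 3x3 kernels, 3 input channels
    mask = torch.ones(16)
    for _ in range(8):
        current = float(ising_pruning_energy(W, mask))
        trials = []
        for i in range(len(mask)):
            flipped = mask.clone()
            flipped[i] = 1.0 - flipped[i]
            trials.append(float(ising_pruning_energy(W, flipped)))
        best = min(range(len(trials)), key=trials.__getitem__)
        if trials[best] < current:
            mask[best] = 1.0 - mask[best]
    print("kept kernels:", int(mask.sum()), "of", len(mask))
```

In this sketch, minimizing the energy trades off the two terms: the coupling term grows when similar kernels are both kept, while the field term drops as more high-magnitude kernels are retained, so the minimizer keeps a subset of active, mutually dissimilar kernels.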