On the role of synaptic stochasticity in training low-precision neural networks

10/26/2017
by Carlo Baldassi, et al.

Stochasticity and limited precision of synaptic weights in neural network models are key aspects of both biological and hardware modeling of learning processes. Here we show that a neural network model with stochastic binary weights naturally gives prominence to exponentially rare dense regions of solutions with a number of desirable properties such as robustness and good generalization performance, while typical solutions are isolated and hard to find. Binary solutions of the standard perceptron problem are obtained from a simple gradient descent procedure on a set of real values parametrizing a probability distribution over the binary synapses. Both analytical and numerical results are presented. An algorithmic extension aimed at training discrete deep neural networks is also investigated.
