The Neural Coding Framework for Learning Generative Models

12/07/2020
by Alexander Ororbia, et al.

Neural generative models can be used to learn complex probability distributions from data, to sample from them, and to produce probability density estimates. We propose a novel neural generative model inspired by the theory of predictive processing in the brain. According to predictive processing theory, the neurons in the brain form a hierarchy in which neurons in one level form expectations about sensory inputs from another level. These neurons update their local models based on differences between their expectations and the observed signals. In a similar way, artificial neurons in our generative model predict what neighboring neurons will do and adjust their parameters based on how well their predictions match reality. This neural generative model performs well in practice: on a variety of benchmark datasets and metrics, it either remains competitive with or significantly outperforms other generative models with similar functionality, such as the variational auto-encoder.
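To make the predictive-processing idea concrete, the sketch below shows a minimal two-layer, error-driven update in NumPy: a top layer predicts the activity of a bottom layer, the layer state is nudged to reduce the prediction error, and the weights then receive a local, Hebbian-like update driven by that error. This is only an illustration of the general mechanism under assumed dimensions and hyperparameters (n_top, n_bottom, beta, lr), not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for a tiny two-layer hierarchy (illustration only).
n_top, n_bottom = 8, 16

# Generative weights: the top layer forms expectations about the bottom layer.
W = rng.normal(0.0, 0.1, size=(n_bottom, n_top))

phi = np.tanh  # nonlinearity applied to the latent state


def settle_and_learn(x, n_steps=20, beta=0.1, lr=0.01):
    """One predictive-coding-style step on a single input vector x.

    The top-layer state is iteratively adjusted to reduce the prediction
    error at the bottom layer; the weights then get a local, error-driven
    update. A rough sketch of the idea, not the paper's exact algorithm.
    """
    global W
    z = np.zeros(n_top)                      # latent state of the top layer
    for _ in range(n_steps):
        pred = W @ phi(z)                    # top-down expectation of the input
        err = x - pred                       # prediction error at the bottom layer
        # State update: follow the error fed back through W (local signal only).
        z += beta * (W.T @ err) * (1.0 - phi(z) ** 2)
    # Local weight update: outer product of the error and presynaptic activity.
    W += lr * np.outer(err, phi(z))
    return float(np.mean(err ** 2))


# Example: settle on a random "sensory" vector and report the residual error.
x = rng.normal(size=n_bottom)
print(settle_and_learn(x))
```

Note that both the state and weight updates depend only on quantities available at the connected layers (the prediction error and the presynaptic activity), which is the locality property the abstract refers to.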

