A brief note on understanding neural networks as Gaussian processes

07/25/2021
by Mengwu Guo, et al.

As a generalization of the work in [Lee et al., 2017], this note briefly discusses when the prior of a neural network output follows a Gaussian process, and how a neural-network-induced Gaussian process is formulated. The posterior mean functions of such a Gaussian process regression lie in the reproducing kernel Hilbert space defined by the neural-network-induced kernel. In the case of two-layer neural networks, the induced Gaussian processes provide an interpretation of the reproducing kernel Hilbert spaces whose union forms a Barron space.
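To make the correspondence concrete, the sketch below implements one standard instance of a neural-network-induced Gaussian process: the kernel of an infinite-width two-layer ReLU network, which has the closed form of an arc-cosine kernel of degree 1 [Cho & Saul, 2009], and a GP regression whose posterior mean lies in the associated reproducing kernel Hilbert space. The weight/bias variances `sigma_w`, `sigma_b`, the noise level, and the toy data are illustrative assumptions, not taken from the note.

```python
import numpy as np

def nngp_kernel(X1, X2, sigma_w=1.0, sigma_b=0.1):
    """Kernel induced by an infinite-width two-layer ReLU network.

    Weights ~ N(0, sigma_w^2 / d) and biases ~ N(0, sigma_b^2);
    the post-ReLU covariance is the degree-1 arc-cosine kernel.
    (sigma_w, sigma_b are illustrative choices.)
    """
    d = X1.shape[1]
    # Pre-activation (input-layer) covariances
    K12 = sigma_b**2 + sigma_w**2 * (X1 @ X2.T) / d
    K11 = sigma_b**2 + sigma_w**2 * np.sum(X1**2, axis=1) / d
    K22 = sigma_b**2 + sigma_w**2 * np.sum(X2**2, axis=1) / d
    norm = np.sqrt(np.outer(K11, K22))
    cos_t = np.clip(K12 / norm, -1.0, 1.0)
    theta = np.arccos(cos_t)
    # Closed-form expectation of ReLU(u) * ReLU(v) for jointly Gaussian u, v
    return sigma_b**2 + (sigma_w**2 / (2 * np.pi)) * norm * (
        np.sin(theta) + (np.pi - theta) * cos_t)

def gp_posterior_mean(X_train, y_train, X_test, noise=1e-2):
    """Posterior mean of GP regression under the NN-induced kernel."""
    K = nngp_kernel(X_train, X_train)
    K_star = nngp_kernel(X_test, X_train)
    alpha = np.linalg.solve(K + noise * np.eye(len(X_train)), y_train)
    return K_star @ alpha

# Toy 1-D regression problem (illustrative data)
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(20, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(20)
Xs = np.linspace(-2, 2, 50)[:, None]
mean = gp_posterior_mean(X, y, Xs)
```

The posterior mean `mean` is a finite linear combination of kernel sections `k(·, x_i)`, which is exactly why it lies in the reproducing kernel Hilbert space of the network-induced kernel.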
