Mixed neural network Gaussian processes

12/01/2021
by Alexey Lindo, et al.

This paper makes two contributions. First, it introduces mixed compositional kernels and mixed neural network Gaussian processes (NNGPs). Mixed compositional kernels are generated by composing probability generating functions (PGFs). A mixed NNGP is a Gaussian process (GP) with a mixed compositional kernel, arising as the infinite-width limit of a multilayer perceptron (MLP) whose activation function differs from layer to layer. Second, building on the theory of branching processes, and specifically on θ PGFs, the paper introduces θ activation functions for neural networks and θ compositional kernels. Although θ compositional kernels are recursive, they admit closed-form expressions. It is shown that θ compositional kernels have non-degenerate asymptotic properties under certain conditions. GPs with θ compositional kernels therefore avoid non-explicit recursive kernel evaluations and have controllable infinite-depth asymptotic behaviour. Whether GPs with θ compositional kernels arise as limits of infinitely wide MLPs with θ activation functions remains an open research question.
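
The paper's θ kernels are built from θ PGFs and have closed-form expressions not reproduced in this abstract. The sketch below instead illustrates the generic "mixed" construction the first contribution refers to: the standard NNGP layer-to-layer kernel recursion, K^{l+1}(x, y) = σ_w² E[φ_l(u) φ_l(v)] + σ_b² with (u, v) Gaussian under the layer-l kernel, iterated with a different activation φ_l at each layer. The expectation is estimated here by Monte Carlo; all function names and parameters are illustrative assumptions, not the authors' code.

```python
import numpy as np
from scipy.special import erf

def nngp_layer(Kxx, Kxy, Kyy, act, n_samples=100_000, sw2=1.0, sb2=0.0, rng=None):
    """One step of the NNGP kernel recursion:
    K^{l+1}(x, y) = sw2 * E[act(u) act(v)] + sb2,
    where (u, v) ~ N(0, [[Kxx, Kxy], [Kxy, Kyy]]).
    The Gaussian expectation is estimated by Monte Carlo."""
    rng = np.random.default_rng() if rng is None else rng
    cov = np.array([[Kxx, Kxy], [Kxy, Kyy]])
    u, v = rng.multivariate_normal(np.zeros(2), cov, size=n_samples).T
    new_Kxy = sw2 * np.mean(act(u) * act(v)) + sb2
    new_Kxx = sw2 * np.mean(act(u) ** 2) + sb2
    new_Kyy = sw2 * np.mean(act(v) ** 2) + sb2
    return new_Kxx, new_Kxy, new_Kyy

def mixed_nngp_kernel(x, y, activations, sw2=1.0, sb2=0.0, seed=0):
    """Mixed compositional kernel: iterate the NNGP recursion with a
    (possibly different) activation function for each layer."""
    rng = np.random.default_rng(seed)
    Kxx, Kxy, Kyy = x @ x, x @ y, y @ y  # linear base kernel at the input layer
    for act in activations:
        Kxx, Kxy, Kyy = nngp_layer(Kxx, Kxy, Kyy, act, sw2=sw2, sb2=sb2, rng=rng)
    return Kxy

# Example: a three-layer "mixed" network with erf, ReLU, and tanh activations.
x = np.array([1.0, 0.5]) / np.sqrt(2)
y = np.array([0.2, -0.7]) / np.sqrt(2)
k = mixed_nngp_kernel(x, y, [erf, lambda t: np.maximum(t, 0.0), np.tanh])
print(f"mixed NNGP kernel value: {k:.4f}")
```

The point of the paper's θ construction, by contrast, is precisely to replace this Monte Carlo recursion with closed-form kernel expressions whose infinite-depth limits can be controlled.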
