An Empirical Analysis of the Advantages of Finite- v.s. Infinite-Width Bayesian Neural Networks

11/16/2022
by Jiayu Yao, et al.

Comparing Bayesian neural networks (BNNs) of different widths is challenging because, as the width increases, multiple model properties change simultaneously, and inference in the finite-width case is intractable. In this work, we empirically compare finite- and infinite-width BNNs and provide quantitative and qualitative explanations for their performance differences. We find that when the model is mis-specified, increasing the width can hurt BNN performance. In these cases, we provide evidence that finite-width BNNs generalize better, in part because of properties of their frequency spectrum that allow them to adapt under model mismatch.
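The abstract turns on the finite- versus infinite-width distinction, so a brief illustration may help: as the hidden width grows, the prior over outputs of a randomly initialized network approaches a Gaussian process (the NNGP limit), which is one reason several prior properties change with width at once. The sketch below is not the paper's code; the network architecture, variance scalings, and the kurtosis diagnostic are assumptions chosen only to make the limiting behavior visible.

```python
import numpy as np

def sample_prior_outputs(x, width, n_samples=2000, sigma_w=1.0, sigma_b=0.1):
    """Draw f(x) from random one-hidden-layer ReLU networks of a given width."""
    d = x.shape[0]
    outputs = np.zeros(n_samples)
    for s in range(n_samples):
        # Standard 1/sqrt(fan-in) variance scaling so the limit is well defined.
        W1 = np.random.normal(0, sigma_w / np.sqrt(d), size=(width, d))
        b1 = np.random.normal(0, sigma_b, size=width)
        W2 = np.random.normal(0, sigma_w / np.sqrt(width), size=width)
        b2 = np.random.normal(0, sigma_b)
        h = np.maximum(W1 @ x + b1, 0.0)   # ReLU hidden layer
        outputs[s] = W2 @ h + b2
    return outputs

x = np.array([0.5, -1.0, 2.0])
for width in (2, 16, 128, 1024):
    f = sample_prior_outputs(x, width)
    # Excess kurtosis near 0 indicates the output prior is close to Gaussian,
    # i.e. the finite-width prior is approaching the infinite-width NNGP limit.
    kurt = np.mean((f - f.mean()) ** 4) / f.var() ** 2 - 3.0
    print(f"width={width:5d}  prior std={f.std():.3f}  excess kurtosis={kurt:+.2f}")
```

Running this shows the non-Gaussianity of the output prior shrinking with width, which is exactly the kind of width-dependent property the paper argues must be disentangled when comparing finite- and infinite-width BNNs.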
