Information-Theoretic Bounds on the Moments of the Generalization Error of Learning Algorithms

02/03/2021
by Gholamali Aminian, et al.

Generalization error bounds are critical to understanding the performance of machine learning models. In this work, building on a new bound on the expected value of an arbitrary function of the population and empirical risk of a learning algorithm, we offer a more refined analysis of the generalization behaviour of machine learning models based on a characterization of (bounds on) their generalization error moments. We discuss how the proposed bounds, which also encompass new bounds on the expected generalization error, relate to existing bounds in the literature. We also discuss how the proposed generalization error moment bounds can be used to construct new high-probability bounds on the generalization error.
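
As background, and not quoted from the paper itself, the generalization error of a hypothesis W trained on a dataset S of n samples is the gap between population and empirical risk, and the standard route from a moment bound to a high-probability bound is Markov's inequality. The symbols below (loss \ell, data distribution \mu) are generic placeholders for this sketch:

\[
  \mathrm{gen}(W, S) \;=\; \mathbb{E}_{Z \sim \mu}\big[\ell(W, Z)\big] \;-\; \frac{1}{n}\sum_{i=1}^{n} \ell(W, Z_i),
\]
\[
  \mathbb{P}\big(|\mathrm{gen}(W, S)| \ge \epsilon\big) \;\le\; \frac{\mathbb{E}\big[|\mathrm{gen}(W, S)|^{p}\big]}{\epsilon^{p}}, \qquad \epsilon > 0,\ p \ge 1,
\]

so any bound on the p-th moment of the generalization error immediately yields a tail bound of this form.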
