Finite Littlestone Dimension Implies Finite Information Complexity

06/27/2022
by Aditya Pradeep, et al.

We prove that every online learnable class of functions of Littlestone dimension d admits a learning algorithm with finite information complexity. To this end, we use the notion of a globally stable algorithm. In general, the information complexity of such a globally stable algorithm is large yet finite, roughly exponential in d. We also show there is room for improvement: for a canonical online learnable class, the indicator functions of affine subspaces of dimension d, the information complexity can be upper bounded logarithmically in d.
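In this line of work, the information complexity of a learning algorithm is typically measured as the mutual information between the input training sample and the output hypothesis. The display below is a sketch under that standard definition; the symbols A for the algorithm, S^n for an i.i.d. sample of size n drawn from a distribution D, and IC for information complexity are notation chosen here for illustration, not necessarily the paper's.

\[
  \mathrm{IC}(\mathcal{A}) \;=\; I\bigl(\mathcal{A}(S^n)\,;\,S^n\bigr),
  \qquad S^n = (Z_1, \ldots, Z_n) \sim \mathcal{D}^n .
\]

Read this way, the abstract's two claims are that any class of Littlestone dimension d admits a learner with finite IC(A), of order roughly exponential in d via global stability, and that for indicators of d-dimensional affine subspaces the bound improves to order log d.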
