Nuances in Margin Conditions Determine Gains in Active Learning

10/16/2021
by   Samory Kpotufe, et al.

We consider nonparametric classification with smooth regression functions, where it is well known that notions of margin in E[Y|X] determine fast or slow rates in both active and passive learning. Here we elucidate a striking distinction between the two settings. Namely, we show that some seemingly benign nuances in notions of margin, involving the uniqueness of the Bayes classifier and having no apparent effect on rates in passive learning, determine whether or not any active learner can outperform passive learning rates. In particular, under Audibert-Tsybakov's margin condition (which allows general situations with non-unique Bayes classifiers), no active learner can gain over passive learning in commonly studied settings where the marginal on X is near uniform. Our results thus contradict the usual intuition from past literature that active rates should improve over passive rates in nonparametric settings.
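For context, the margin condition referenced in the abstract is commonly stated as follows; this is a standard textbook formulation of the Tsybakov noise condition, and the paper's exact variants may differ in detail:

```latex
% Tsybakov margin (noise) condition with exponent \beta \ge 0:
% the regression function \eta(x) = \mathbb{E}[Y \mid X = x] places
% little mass near the decision-boundary level 1/2.
\[
  \mathbb{P}_X\!\left( \bigl| \eta(X) - \tfrac{1}{2} \bigr| \le t \right)
  \;\le\; C\, t^{\beta}
  \qquad \text{for all } 0 < t \le t_0 .
\]
% Larger \beta means less mass near the boundary and hence faster rates.
% The Audibert-Tsybakov form permits \eta(X) = 1/2 on a set of positive
% measure, i.e., a non-unique Bayes classifier -- the regime the abstract
% identifies as blocking active-learning gains.
```

Intuitively, when η can sit exactly at 1/2 over a region, label queries in that region are uninformative, which is one way to see why uniqueness of the Bayes classifier can matter for active learning while leaving passive rates unchanged.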
