Robust Linear Predictions: Analyses of Uniform Concentration, Fast Rates and Model Misspecification

01/06/2022 ∙ by Saptarshi Chakraborty, et al.
The problem of linear prediction has been extensively studied for the past century under fairly general frameworks. Recent advances in the robust statistics literature allow us to analyze robust versions of classical linear models through the prism of Median of Means (MoM). Combining these approaches in a piecemeal way may lead to ad-hoc procedures, and the restricted theoretical conclusions that underpin each individual contribution may no longer be valid. To meet these challenges coherently, in this study we offer a unified robust framework that covers a broad variety of linear prediction problems on a Hilbert space, coupled with a generic class of loss functions. Notably, we require no assumptions on the distribution of the outlying data points (𝒪), nor compactness of the support of the inlying ones (ℐ). Under mild conditions on the dual norm, we show that for misspecification level ϵ, these estimators achieve an error rate of O(max{|𝒪|^{1/2} n^{-1/2}, |ℐ|^{1/2} n^{-1}} + ϵ), matching the best-known rates in the literature. This rate is slightly slower than the classical rate of O(n^{-1/2}), indicating that we must pay a price in error rate to obtain robust estimates. Additionally, we show that this rate can be improved to achieve so-called "fast rates" under additional assumptions.
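To give a concrete sense of the Median of Means (MoM) principle the abstract refers to, here is a minimal, generic sketch of MoM estimation of a mean using NumPy. This is not the paper's estimator (which targets linear prediction on a Hilbert space); the function name, block count, and simulated data are illustrative assumptions.

```python
import numpy as np

def median_of_means(x, n_blocks):
    # Generic MoM sketch (not the paper's estimator):
    # partition the sample into blocks, average each block,
    # and return the median of the block means. A few grossly
    # corrupted blocks cannot move the median much.
    x = np.asarray(x, dtype=float)
    blocks = np.array_split(x, n_blocks)
    block_means = np.array([b.mean() for b in blocks])
    return np.median(block_means)

# Hypothetical data: 1000 inliers from N(0, 1) plus 20 huge outliers.
rng = np.random.default_rng(0)
inliers = rng.normal(0.0, 1.0, size=1000)
outliers = np.full(20, 1e6)
data = np.concatenate([inliers, outliers])

# The empirical mean is destroyed by the outliers; MoM stays near 0.
print(data.mean())
print(median_of_means(data, n_blocks=50))
```

The choice of the number of blocks mirrors the |𝒪|-dependence in the stated rate: robustness holds as long as the outliers contaminate fewer than half of the blocks.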
