Second-Order Asymptotically Optimal Statistical Classification

06/03/2018
by Lin Zhou, et al.

Motivated by real-world machine learning applications, we analyze approximations to the non-asymptotic fundamental limits of statistical classification. In the binary version of this problem, given two training sequences generated according to two unknown distributions P_1 and P_2, one is tasked with classifying a test sequence known to be generated according to either P_1 or P_2. This problem can be viewed as an analogue of binary hypothesis testing in which the generating distributions are unknown. Motivated by finite-sample considerations, we characterize the second-order asymptotic (dispersion-type) tradeoff between the type-I and type-II error probabilities for tests that ensure (i) the type-I error probability decays exponentially fast for all pairs of distributions and (ii) the type-II error probability is non-vanishing for a particular pair of distributions. We then generalize our results to the classification of multiple hypotheses with a rejection option.
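To make the setup concrete, the following is a minimal sketch of a type-based (empirical-distribution) classifier in the spirit of the tests studied for this problem; it is not the paper's exact decision rule, and the function names, the KL-based decision statistic, and the smoothing constant `eps` are illustrative assumptions.

```python
from collections import Counter
import math

def empirical_dist(seq, alphabet):
    """Empirical distribution (type) of a sequence over a fixed alphabet."""
    counts = Counter(seq)
    n = len(seq)
    return {a: counts.get(a, 0) / n for a in alphabet}

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D(p || q); eps avoids log(0) when q has empty bins."""
    return sum(p[a] * math.log(p[a] / (q[a] + eps)) for a in p if p[a] > 0)

def classify(test_seq, train1, train2):
    """Declare the hypothesis whose training type is closer (in KL) to the
    type of the test sequence: 1 for P_1, 2 for P_2."""
    alphabet = set(test_seq) | set(train1) | set(train2)
    t = empirical_dist(test_seq, alphabet)
    d1 = kl_divergence(t, empirical_dist(train1, alphabet))
    d2 = kl_divergence(t, empirical_dist(train2, alphabet))
    return 1 if d1 <= d2 else 2
```

For example, with a training sequence dominated by symbol `a` under P_1 and by `b` under P_2, a test sequence that is mostly `a` is assigned to hypothesis 1. Thresholding the gap between `d1` and `d2`, rather than comparing them directly, is how one trades off the type-I and type-II error probabilities discussed above and, in the multi-hypothesis setting, implements a rejection option.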
