On the Gap Between Strict-Saddles and True Convexity: An Omega(log d) Lower Bound for Eigenvector Approximation

04/14/2017
by Max Simchowitz, et al.

We prove a query complexity lower bound on rank-one principal component analysis (PCA). We consider an oracle model in which, given a symmetric matrix M ∈ ℝ^{d × d}, an algorithm is allowed to make T exact queries of the form w^(i) = Mv^(i) for i ∈ {1, …, T}, where v^(i) is drawn from a distribution which depends arbitrarily on the past queries and measurements {v^(j), w^(j)}_{1 ≤ j ≤ i−1}. We show that for a small constant ϵ, any adaptive, randomized algorithm which can find a unit vector v̂ for which v̂^⊤Mv̂ ≥ (1−ϵ)‖M‖, with even small probability, must make T = Ω(log d) queries. In addition to settling a widely-held folk conjecture, this bound demonstrates a fundamental gap between convex optimization and "strict-saddle" non-convex optimization, of which PCA is a canonical example: in the former, first-order methods can have dimension-free iteration complexity, whereas in PCA, the iteration complexity of gradient-based methods must necessarily grow with the dimension. Our argument proceeds via a reduction to estimating the rank-one spike in a deformed Wigner model. We establish lower bounds for this model by developing a "truncated" analogue of the χ^2 Bayes-risk lower bound of Chen et al.
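To make the query model concrete, here is a minimal Python sketch (not from the paper): it builds a deformed Wigner instance M = λuu^⊤ + W, wraps it in an oracle that counts exact matrix-vector queries w = Mv, and runs power iteration as a canonical algorithm in this model. The names deformed_wigner, MatvecOracle, and power_method, and the specific normalizations, are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def deformed_wigner(d, lam, rng):
    """Sample M = lam * u u^T + W: a rank-one spike plus Wigner noise.
    (Hypothetical instance generator; W is symmetric with ~N(0, 1/d)
    entries, so ||W|| concentrates near 2 as d grows.)"""
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    G = rng.standard_normal((d, d)) / np.sqrt(d)
    W = (G + G.T) / np.sqrt(2)
    return lam * np.outer(u, u) + W, u

class MatvecOracle:
    """Exact matrix-vector oracle w^(i) = M v^(i); counts queries T."""
    def __init__(self, M):
        self._M = M
        self.queries = 0
    def __call__(self, v):
        self.queries += 1
        return self._M @ v

def power_method(oracle, d, T, rng):
    """T steps of power iteration, one oracle query per step."""
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)
    for _ in range(T):
        w = oracle(v)
        v = w / np.linalg.norm(w)
    return v

rng = np.random.default_rng(0)
d, lam = 500, 3.0
M, u = deformed_wigner(d, lam, rng)
oracle = MatvecOracle(M)
v_hat = power_method(oracle, d, T=25, rng=rng)
# Compare the Rayleigh quotient v^T M v against the operator norm ||M||:
rq = v_hat @ (M @ v_hat)
print(f"queries T = {oracle.queries}, "
      f"v^T M v = {rq:.3f}, ||M|| = {np.linalg.norm(M, 2):.3f}")
```

On instances like this, power iteration (and Krylov methods more generally) reaches a (1−ϵ)-approximation of ‖M‖ in O(log d) queries; the paper's Ω(log d) lower bound shows that this dependence on the dimension is unavoidable for any adaptive, randomized algorithm in the query model.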

