Algorithmic thresholds for tensor PCA

08/02/2018
by   Gérard Ben Arous, et al.

We study the algorithmic thresholds for principal component analysis of Gaussian k-tensors with a planted rank-one spike, via Langevin dynamics and gradient descent. In order to efficiently recover the spike from natural initializations, the signal-to-noise ratio must diverge in the dimension. Our proof shows that the mechanism for the success or failure of recovery is the strength of the "curvature" of the spike on the maximum entropy region of the initial data. To demonstrate this, we study the dynamics on a generalized family of high-dimensional landscapes with planted signals, containing the spiked tensor models as specific instances. We identify thresholds of signal-to-noise ratios above which order-one time recovery succeeds; in the case of the spiked tensor model, these match the thresholds conjectured for algorithms such as Approximate Message Passing. Below these thresholds, where the curvature of the signal on the maximum entropy region is weak, we show that recovery from certain natural initializations takes at least stretched exponential time. Our approach combines global regularity estimates for spin glasses with pointwise estimates to study the recovery problem perturbatively.
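To make the setting concrete, the following is a minimal sketch (in Python/NumPy) of the spiked 3-tensor model and plain projected gradient ascent from a random initialization. The normalizations, step size, and function names (`spiked_tensor`, `gradient_ascent`) are illustrative assumptions, not the paper's exact conventions or algorithm.

```python
import numpy as np

def spiked_tensor(n, snr, rng):
    """Y = snr * v^{(x)3} + noise, with v a planted unit vector.
    Noise scaling is an assumed convention for illustration."""
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    noise = rng.standard_normal((n, n, n)) / np.sqrt(n)
    signal = snr * np.einsum('i,j,k->ijk', v, v, v)
    return signal + noise, v

def gradient_ascent(Y, x0, steps=500, lr=0.1):
    """Maximize <Y, x^{(x)3}> over the unit sphere by projected ascent."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        # Euclidean gradient of sum_{ijk} Y_ijk x_i x_j x_k
        grad = (np.einsum('ijk,j,k->i', Y, x, x)
                + np.einsum('ijk,i,k->j', Y, x, x)
                + np.einsum('ijk,i,j->k', Y, x, x))
        x = x + lr * grad
        x /= np.linalg.norm(x)  # retract back to the sphere
    return x

rng = np.random.default_rng(0)
n = 50
Y, v = spiked_tensor(n, snr=5.0, rng=rng)
x0 = rng.standard_normal(n)          # a "natural" random initialization
x_hat = gradient_ascent(Y, x0)
print("overlap |<x_hat, v>| =", abs(x_hat @ v))
```

In this toy experiment, recovery is measured by the overlap between the estimate and the planted spike; the paper's results concern how large the signal-to-noise ratio must be, as a function of the dimension, for such descent-type dynamics to achieve nontrivial overlap in order-one time.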
