Fast and robust tensor decomposition with applications to dictionary learning

06/27/2017
by   Tselil Schramm, et al.

We develop fast spectral algorithms for tensor decomposition that match the robustness guarantees of the best known polynomial-time algorithms for this problem based on the sum-of-squares (SOS) semidefinite programming hierarchy. Our algorithms can decompose a 4-tensor with n-dimensional orthonormal components in the presence of error with constant spectral norm (when viewed as an n^2-by-n^2 matrix). The running time is n^5, which is close to linear in the input size n^4. We also obtain algorithms with similar running time to learn sparsely-used orthogonal dictionaries even when feature representations have constant relative sparsity and non-independent coordinates. The only previous polynomial-time algorithms to solve these problems are based on solving large semidefinite programs. In contrast, our algorithms are easy to implement directly and are based on spectral projections and tensor-mode rearrangements. Our work is inspired by recent work of Hopkins, Schramm, Shi, and Steurer (STOC'16) that shows how fast spectral algorithms can achieve the guarantees of SOS for average-case problems. In this work, we introduce general techniques to capture the guarantees of SOS for worst-case problems.
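To make the two primitives named in the abstract concrete, here is a minimal numpy sketch, not the paper's actual algorithm: it builds a synthetic 4-tensor with orthonormal components, rearranges its modes to view it as an n^2-by-n^2 matrix, perturbs it by an error of constant spectral norm, and applies a spectral projection. The dimension, noise level, and eigenvalue threshold are arbitrary choices for illustration.

```python
import numpy as np

# Illustrative sketch only: exercises tensor-mode rearrangement and
# spectral projection on a synthetic instance. Not the paper's algorithm;
# all constants below are arbitrary.

n = 20
rng = np.random.default_rng(0)

# Orthonormal components a_1, ..., a_n (rows of an orthogonal matrix),
# and the 4-tensor T = sum_r a_r (x) a_r (x) a_r (x) a_r.
A = np.linalg.qr(rng.standard_normal((n, n)))[0]
T = np.einsum('ri,rj,rk,rl->ijkl', A, A, A, A)

# Tensor-mode rearrangement: group modes (1,2) as rows and (3,4) as
# columns to view T as an n^2-by-n^2 matrix of rank n.
M = T.reshape(n * n, n * n)

# Perturb by a symmetric error matrix of constant spectral norm (0.5).
E = rng.standard_normal((n * n, n * n))
E = (E + E.T) / 2
M_noisy = M + 0.5 * E / np.linalg.norm(E, 2)

# Spectral projection: the signal eigenvalues sit near 1 while the noise
# contributes eigenvalues of magnitude at most 0.5, so thresholding
# midway recovers (approximately) the span of the vec(a_r a_r^T).
eigvals, eigvecs = np.linalg.eigh(M_noisy)
top = eigvecs[:, eigvals > 0.75]   # keep the large-eigenvalue directions
P = top @ top.T                    # projector onto the estimated signal span
M_denoised = P @ M_noisy @ P
```

Recovering the individual components from the projected matrix requires further steps; the sketch only shows the matrix-level primitives that the abstract highlights.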
