Matrix Completion in Almost-Verification Time
We give a new framework for solving the fundamental problem of low-rank matrix completion, i.e., approximating a rank-r matrix 𝐌 ∈ ℝ^(m×n) (where m ≥ n) from random observations. First, we provide an algorithm which completes 𝐌 on 99% of rows and columns under no further assumptions on 𝐌 from ≈ mr samples and using ≈ mr² time. Then, assuming the row and column spans of 𝐌 satisfy additional regularity properties, we show how to boost this partial completion guarantee to a full matrix completion algorithm by aggregating solutions to regression problems involving the observations. In the well-studied setting where 𝐌 has incoherent row and column spans, our algorithms complete 𝐌 to high precision from mr^(2+o(1)) observations in mr^(3+o(1)) time (omitting logarithmic factors in problem parameters), improving upon the prior state of the art [JN15], which used ≈ mr⁵ samples and ≈ mr⁷ time. Under an assumption on the row and column spans of 𝐌 which we introduce (and which is satisfied by random subspaces with high probability), our sample complexity improves to an almost information-theoretically optimal mr^(1+o(1)), and our runtime improves to mr^(2+o(1)). Our runtimes have the appealing property of matching the best known runtime to verify that a rank-r decomposition 𝐔𝐕^⊤ agrees with the sampled observations. We also provide robust variants of our algorithms that, given random observations from 𝐌 + 𝐍 with ‖𝐍‖_F ≤ Δ, complete 𝐌 to Frobenius norm distance ≈ r^(1.5)·Δ in the same runtimes as in the noiseless setting. Prior noisy matrix completion algorithms [CP10] only guaranteed a distance of ≈ √n·Δ.
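To make the "regression problems involving the observations" concrete, here is a minimal NumPy sketch of the per-row regression step that this style of framework relies on: if a basis for the row span of 𝐌 is known (here we simply use the ground-truth factor 𝐕, an illustrative assumption — the paper's algorithms must estimate it from samples), each row of 𝐌 can be recovered by least-squares regression of its observed entries onto the corresponding rows of the basis. This is a toy illustration of the principle, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r, p = 60, 50, 3, 0.5  # dimensions, rank, sampling probability (illustrative)

# Ground-truth rank-r matrix M = U V^T.
U = rng.standard_normal((m, r))
V = rng.standard_normal((n, r))
M = U @ V.T

mask = rng.random((m, n)) < p  # Bernoulli(p) observation pattern

# Orthonormal basis Q for the row span of M. In a real algorithm this basis
# would be estimated from the observations; using the true V is a shortcut.
Q, _ = np.linalg.qr(V)

# Recover each row of M by regressing its observed entries onto the
# corresponding rows of Q (an overdetermined, consistent least-squares system
# whenever the row has at least r observations in general position).
M_hat = np.zeros_like(M)
for i in range(m):
    obs = mask[i]
    coef, *_ = np.linalg.lstsq(Q[obs], M[i, obs], rcond=None)
    M_hat[i] = Q @ coef

err = np.linalg.norm(M_hat - M) / np.linalg.norm(M)
```

Because each row of 𝐌 lies exactly in the span of Q's columns, the regression recovers it exactly (up to floating-point error) once a row has at least r well-conditioned observations; the hard part of the actual framework is obtaining the span estimate and handling the rows and columns where this fails.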