On the analysis of optimization with fixed-rank matrices: a quotient geometric view

03/13/2022
by Shuyu Dong, et al.

We study a type of Riemannian gradient descent (RGD) algorithm, designed through Riemannian preconditioning, for optimization on ℳ_k^m×n, the set of m×n real matrices with fixed rank k. Our analysis is based on a quotient geometric view of ℳ_k^m×n: by identifying this set, via matrix factorization, with the quotient manifold of the two-term product space ℝ_*^m×k × ℝ_*^n×k of matrices with full column rank, we obtain an explicit form for the update rule of the RGD algorithm, which leads to a novel approach to analyzing its convergence behavior in rank-constrained optimization. We then deduce several properties that show how RGD differs from other matrix factorization algorithms, such as those based on Euclidean geometry. In particular, we show that the RGD algorithm is not only faster than Euclidean gradient descent but also does not rely on the balancing technique to ensure its efficiency, while the latter does. Building on these results, we further show that this RGD algorithm is guaranteed to solve matrix sensing and matrix completion problems with a linear convergence rate, under mild conditions related to the restricted positive definiteness property. Numerical experiments on matrix sensing and completion are provided to demonstrate these properties.
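For intuition, the sketch below shows one preconditioned gradient step on the factors of X = GH^T. It is not the paper's exact update rule; it assumes the preconditioner takes a common Riemannian-preconditioning form that scales each factor's Euclidean gradient by the Gram matrix of the other factor. The function name rgd_step, the step size, and the gradient arguments are illustrative assumptions.

```python
import numpy as np

def rgd_step(G, H, euclid_grad_G, euclid_grad_H, step_size):
    """One hypothetical preconditioned gradient step on the factors (G, H) of X = G H^T.

    Sketch only: each Euclidean factor gradient is scaled by the Gram
    matrix of the other factor, a common form of Riemannian
    preconditioning on the quotient of R_*^{m x k} x R_*^{n x k}.
    """
    # k x k Gram matrices, assumed invertible since both factors
    # have full column rank.
    HtH = H.T @ H
    GtG = G.T @ G
    # Preconditioned (Riemannian) gradients.
    rgrad_G = np.linalg.solve(HtH, euclid_grad_G.T).T  # grad_G (H^T H)^{-1}
    rgrad_H = np.linalg.solve(GtG, euclid_grad_H.T).T  # grad_H (G^T G)^{-1}
    # Plain factor update; X = G H^T generically stays rank k.
    return G - step_size * rgrad_G, H - step_size * rgrad_H
```

Scaling by (H^T H)^{-1} and (G^T G)^{-1} makes the step largely insensitive to how the rank-k factorization is balanced between G and H, which is consistent with the abstract's claim that RGD does not rely on a balancing technique, unlike Euclidean gradient descent on the factors.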
