A Faster Interior-Point Method for Sum-of-Squares Optimization
We present a faster interior-point method for optimizing sum-of-squares (SOS) polynomials, which are a central tool in polynomial optimization and capture convex programming in the Lasserre hierarchy. Let p = ∑_i q_i^2 be an n-variate SOS polynomial of degree 2d. Denoting by L := (n+d choose d) and U := (n+2d choose 2d) the dimensions of the vector spaces in which the q_i's and p live, respectively, our algorithm runs in time Õ(LU^1.87). This is polynomially faster than state-of-the-art SOS and semidefinite programming solvers, which achieve runtime Õ(L^0.5 · min{U^2.37, L^4.24}). The centerpiece of our algorithm is a dynamic data structure for maintaining the inverse of the Hessian of the SOS barrier function under the polynomial interpolant basis, which efficiently extends to multivariate SOS optimization, and requires maintaining spectral approximations to low-rank perturbations of elementwise (Hadamard) products. This is the main challenge and departure from recent IPM breakthroughs using inverse maintenance, in which low-rank updates to the slack matrix readily imply the same for the Hessian matrix.
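As a point of reference, the dimensions L and U are binomial coefficients, and the inverse-maintenance idea the abstract alludes to rests on low-rank updates to an explicitly stored inverse. The sketch below is purely illustrative and is not the paper's data structure: it computes L and U for given n and d, and shows the standard Sherman–Morrison–Woodbury step that updates an inverse under a rank-k perturbation in O(L^2 k) time instead of the O(L^3) cost of inverting from scratch. All names and sizes here are hypothetical.

```python
import math
import numpy as np

def sos_dimensions(n: int, d: int) -> tuple[int, int]:
    """Basis dimensions from the abstract: L = C(n+d, d) for the q_i's
    (degree <= d), U = C(n+2d, 2d) for p (degree <= 2d)."""
    L = math.comb(n + d, d)
    U = math.comb(n + 2 * d, 2 * d)
    return L, U

def woodbury_update(H_inv: np.ndarray, V: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Given H^{-1}, return (H + V W^T)^{-1} via Sherman-Morrison-Woodbury.
    For a rank-k update this costs O(L^2 k) rather than O(L^3)."""
    k = V.shape[1]
    # Capacitance matrix: I_k + W^T H^{-1} V  (k x k, cheap to invert)
    C = np.eye(k) + W.T @ H_inv @ V
    return H_inv - H_inv @ V @ np.linalg.solve(C, W.T @ H_inv)

# Tiny demo with hypothetical sizes: maintain an inverse under a rank-2 update.
L, U = sos_dimensions(n=3, d=2)   # L = C(5,2) = 10, U = C(7,4) = 35
rng = np.random.default_rng(0)
A = rng.standard_normal((L, L))
H = A @ A.T + L * np.eye(L)       # a well-conditioned SPD stand-in for a Hessian
H_inv = np.linalg.inv(H)
V = rng.standard_normal((L, 2))
H_inv_new = woodbury_update(H_inv, V, V)
assert np.allclose(H_inv_new, np.linalg.inv(H + V @ V.T))
```

As the abstract points out, in the SOS setting a low-rank change to the slack matrix does not directly translate into a low-rank change to the Hessian (it appears as a low-rank perturbation of Hadamard products), so the plain Woodbury step above does not suffice on its own; this is where the paper's spectral-approximation data structure comes in.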