A Friendly Smoothed Analysis of the Simplex Method

11/15/2017
by Daniel Dadush, et al.

Explaining the excellent practical performance of the simplex method for linear programming has been a major topic of research for over 50 years. One of the most successful frameworks for understanding the simplex method was given by Spielman and Teng (JACM '04), who developed the notion of smoothed analysis. Starting from an arbitrary linear program with d variables and n constraints, Spielman and Teng analyzed the expected runtime over random perturbations of the LP (the smoothed LP), where Gaussian noise of variance σ^2 is added to the LP data. In particular, they gave a two-stage shadow vertex simplex algorithm which uses an expected O(d^55 n^86 σ^-30) number of simplex pivots to solve the smoothed LP. Their analysis and runtime were substantially improved by Deshpande and Spielman (FOCS '05) and later by Vershynin (SICOMP '09). The fastest current algorithm, due to Vershynin, solves the smoothed LP using an expected O(d^3 log^7 n σ^-4 + d^9 log^7 n) number of pivots, improving the dependence on n from polynomial to logarithmic. While the original proof of Spielman and Teng has now been substantially simplified, the resulting analyses are still quite long and complex, and the parameter dependencies are far from optimal. In this work, we make substantial progress on this front, providing an improved and simpler analysis of shadow simplex methods, where our main algorithm requires an expected O(d^2 √(log n) σ^-2 + d^5 log^(3/2) n) number of simplex pivots. We obtain our results via an improved shadow bound, key to earlier analyses as well, combined with algorithmic techniques of Borgwardt (ZOR '82) and Vershynin. As an added bonus, our analysis is completely modular, allowing us to obtain non-trivial bounds for perturbations beyond Gaussians, such as Laplace perturbations.
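To make the perturbation model concrete, here is a minimal Python sketch of the smoothed-LP setup, assuming an LP of the form max c^T x subject to Ax ≤ b: each entry of the constraint data receives independent N(0, σ^2) Gaussian noise, and the perturbed instance is handed to an off-the-shelf solver. The dimensions d, n and the noise scale σ are illustrative choices, not values from the paper, and scipy's solver uses its own pivot rules rather than the shadow vertex method analyzed here; this only illustrates the perturbation model, not the algorithm.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
d, n, sigma = 5, 50, 0.1          # variables, constraints, noise scale (illustrative)

# Arbitrary base LP data: max c^T x subject to A x <= b.
# Rows are normalized so the noise scale sigma is comparable to the data
# (a simplification of the normalization used in smoothed analysis).
A = rng.standard_normal((n, d))
A /= np.linalg.norm(A, axis=1, keepdims=True)
b = np.ones(n)
c = rng.standard_normal(d)

# Smoothed instance: each entry of (A, b) receives independent N(0, sigma^2) noise.
A_s = A + sigma * rng.standard_normal(A.shape)
b_s = b + sigma * rng.standard_normal(n)

# linprog minimizes, so negate c to maximize c^T x over the perturbed LP.
res = linprog(-c, A_ub=A_s, b_ub=b_s, bounds=[(None, None)] * d)
print("status:", res.status, "optimal x:", res.x)
```

In the smoothed framework, the quantity of interest is the expected pivot count of the algorithm over the random draw of the noise, so an experiment along these lines would average over many independent perturbations of the same base LP.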
