Fast Regression for Structured Inputs

03/14/2022
by Raphael A. Meyer, et al.

We study the ℓ_p regression problem, which requires finding 𝐱 ∈ ℝ^d that minimizes ‖𝐀𝐱 − 𝐛‖_p for a matrix 𝐀 ∈ ℝ^{n × d} and response vector 𝐛 ∈ ℝ^n. There has been recent interest in developing subsampling methods for this problem that can outperform standard techniques when n is very large. However, all known subsampling approaches have run time that depends exponentially on p, typically d^{𝒪(p)}, which can be prohibitively expensive. We improve on this work by showing that for a large class of common structured matrices, such as combinations of low-rank matrices, sparse matrices, and Vandermonde matrices, there are subsampling-based methods for ℓ_p regression that depend polynomially on p. For example, we give an algorithm for ℓ_p regression on Vandermonde matrices that runs in time 𝒪(n log^3 n + (dp^2)^{0.5+ω} · polylog n), where ω is the exponent of matrix multiplication. The polynomial dependence on p crucially allows our algorithms to extend naturally to efficient algorithms for ℓ_∞ regression, via approximation of ℓ_∞ by ℓ_{𝒪(log n)}. Of practical interest, we also develop a new subsampling algorithm for ℓ_p regression for arbitrary matrices, which is simpler than previous approaches for p ≥ 4.
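To make the problem setup concrete, the following is a minimal Python sketch of ℓ_p regression on a small Vandermonde matrix, including the idea of approximating ℓ_∞ by ℓ_{𝒪(log n)} (since ‖r‖_∞ ≤ ‖r‖_p ≤ n^{1/p}‖r‖_∞, choosing p = Θ(log n) makes n^{1/p} a constant). This is only an illustrative direct solver, not the paper's subsampling algorithm; the problem sizes, the generic SciPy optimizer, and the choice p = ⌈log n⌉ are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative example only: small l_p regression instance on a
# Vandermonde matrix. The subsampling machinery from the paper is
# not implemented here.
rng = np.random.default_rng(0)
n, d = 500, 5
t = rng.uniform(-1, 1, size=n)
A = np.vander(t, d, increasing=True)          # Vandermonde matrix, n x d
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

def lp_loss(x, p):
    # Objective ||Ax - b||_p
    return np.sum(np.abs(A @ x - b) ** p) ** (1.0 / p)

def solve_lp(p):
    # Warm-start from the least-squares (p = 2) solution, then minimize
    # the l_p objective with a generic derivative-free optimizer.
    x0 = np.linalg.lstsq(A, b, rcond=None)[0]
    res = minimize(lp_loss, x0, args=(p,), method="Nelder-Mead",
                   options={"xatol": 1e-9, "fatol": 1e-9, "maxiter": 20000})
    return res.x

# Approximate l_infinity regression by l_p regression with p = O(log n):
# ||r||_inf <= ||r||_p <= n^(1/p) ||r||_inf, and n^(1/p) = O(1) for p ~ log n.
p = int(np.ceil(np.log(n)))
x_logn = solve_lp(p)
x_ls = np.linalg.lstsq(A, b, rcond=None)[0]
print("max residual, p = O(log n):", np.max(np.abs(A @ x_logn - b)))
print("max residual, least squares:", np.max(np.abs(A @ x_ls - b)))
```

In this sketch the ℓ_{⌈log n⌉} solution typically attains a smaller worst-case (ℓ_∞) residual than plain least squares, which is the effect the ℓ_∞-via-ℓ_{𝒪(log n)} reduction relies on.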
