Fast Ridge Regression with Randomized Principal Component Analysis and Gradient Descent

05/15/2014
by Yichao Lu, et al.

We propose a new two-stage algorithm, LING, for large-scale regression problems. LING attains the same risk as the well-known ridge regression estimator under the fixed design setting and can be computed much faster. Our experiments show that LING performs well in terms of both prediction accuracy and computational efficiency compared with other large-scale regression algorithms such as Gradient Descent, Stochastic Gradient Descent, and Principal Component Regression, on both simulated and real datasets.
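To make the two-stage idea in the title concrete, here is a minimal, hypothetical sketch of combining randomized PCA with gradient descent for a ridge-type fit: stage 1 solves an exact ridge problem on the top-k principal components (found by a randomized range finder), and stage 2 runs a few gradient-descent steps on the residual in the full feature space. This is an illustration only, not the authors' LING algorithm; the function names, the choice of k, the step size, and the residual-correction step are assumptions.

```python
import numpy as np

def randomized_top_k(X, k, n_iter=2, seed=0):
    """Approximate top-k right singular vectors of X via a randomized range finder."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Omega = rng.standard_normal((d, k))
    Y = X @ Omega
    for _ in range(n_iter):              # power iterations sharpen the subspace estimate
        Y = X @ (X.T @ Y)
    Q, _ = np.linalg.qr(Y)               # orthonormal basis for an approximate range of X
    B = Q.T @ X                          # small k x d matrix
    _, _, Vt = np.linalg.svd(B, full_matrices=False)
    return Vt.T                          # d x k: approximate top principal directions

def two_stage_ridge(X, y, k=20, lam=1.0, n_gd=100, lr=None):
    """Hypothetical two-stage fit: exact ridge on the top-k principal subspace,
    then gradient descent on the residual over the full feature space."""
    V = randomized_top_k(X, k)
    Z = X @ V                                            # stage 1: project onto top components
    w_pc = np.linalg.solve(Z.T @ Z + lam * np.eye(k), Z.T @ y)
    beta = V @ w_pc                                      # map back to original coordinates
    r = y - X @ beta                                     # residual left for stage 2
    if lr is None:
        lr = 1.0 / (np.linalg.norm(X, 2) ** 2 + lam)     # conservative step size (1 / Lipschitz constant)
    delta = np.zeros(X.shape[1])
    for _ in range(n_gd):                                # stage 2: gradient descent on the residual ridge objective
        grad = X.T @ (X @ delta - r) + lam * delta
        delta -= lr * grad
    return beta + delta
```

Under this reading, the randomized projection captures the directions where ridge shrinkage matters least, so only a cheap, low-dimensional solve is needed there, while the inexpensive gradient-descent correction cleans up the remaining signal without ever forming the full d x d covariance matrix.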
