Document Type
Article - preprint
Department
Claremont McKenna College, Mathematics (CMC)
Publication Date
7-22-2015
Abstract
The Kaczmarz and Gauss-Seidel methods both aim to solve an m × n linear system Xβ = y by iteratively refining a solution estimate: the former selects random rows of X and updates β to satisfy the corresponding equations, while the latter selects random columns of X and updates the corresponding coordinates of β. Interest in these methods was recently revitalized by a proof of Strohmer and Vershynin showing linear convergence in expectation for a randomized variant of the Kaczmarz method (RK); a similar result for the randomized Gauss-Seidel algorithm (RGS) was later proved by Leventhal and Lewis. Recent work unified the analysis of these algorithms for overdetermined (m > n) and underdetermined (m < n) systems, showing convergence to the ordinary least squares (OLS) solution and to the minimum Euclidean norm solution, respectively. This paper considers the natural follow-up to the OLS problem, ridge regression, which solves (X∗X + λI)β = X∗y. We present particular variants of RK and RGS for solving this system and derive their convergence rates. We compare them to a recent proposal by Ivanov and Zhdanov for solving this system, which can be interpreted as randomly sampling both rows and columns, and which we argue is often suboptimal. Instead, we claim that one should always use RGS (columns) when m > n and RK (rows) when m < n. This difference in behavior comes down to the minimum eigenvalues of the two related positive semidefinite matrices X∗X + λI_n and XX∗ + λI_m: when m > n, the matrix XX∗ is rank deficient, so λ_min(XX∗ + λI_m) = λ, whereas λ_min(X∗X + λI_n) = σ_min(X)² + λ can be substantially larger (and symmetrically when m < n).
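For concreteness, below is a minimal NumPy sketch of the two sampling strategies applied to the ridge system; it is an illustration under stated assumptions, not the paper's exact variants. The column-sampling (RGS-style) iteration performs exact coordinate minimization of the primal objective (1/2)||Xβ - y||^2 + (λ/2)||β||^2, whose stationarity condition is (X∗X + λI)β = X∗y; the row-sampling (RK-style) iteration does the same on the dual system (XX∗ + λI_m)α = y and returns β = X∗α. The sampling weights (squared row/column norms plus λ) and all function names are assumptions made for the sketch.

```python
import numpy as np

def rgs_ridge(X, y, lam, iters=20000, seed=None):
    """Column-sampling (RGS-style) sketch: exact coordinate descent on
    f(beta) = 0.5*||X beta - y||^2 + 0.5*lam*||beta||^2, whose minimizer
    satisfies (X^T X + lam*I) beta = X^T y."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    w = np.sum(X**2, axis=0) + lam      # assumed weights: ||X_j||^2 + lam
    p = w / w.sum()
    beta = np.zeros(n)
    r = -y.astype(float)                # running residual r = X beta - y
    for _ in range(iters):
        j = rng.choice(n, p=p)
        # Exact minimizer of f along coordinate j.
        delta = -(X[:, j] @ r + lam * beta[j]) / w[j]
        beta[j] += delta
        r += delta * X[:, j]            # keep residual consistent, O(m) per step
    return beta

def rk_ridge(X, y, lam, iters=20000, seed=None):
    """Row-sampling (RK-style) sketch: exact coordinate descent on the dual
    system (X X^T + lam*I) alpha = y; the primal iterate beta = X^T alpha
    converges to the ridge solution."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    w = np.sum(X**2, axis=1) + lam      # assumed weights: ||x_i||^2 + lam
    p = w / w.sum()
    alpha = np.zeros(m)
    beta = np.zeros(n)                  # maintained invariant: beta = X^T alpha
    for _ in range(iters):
        i = rng.choice(m, p=p)
        # Residual of dual equation i: y_i - x_i^T beta - lam*alpha_i.
        delta = (y[i] - X[i] @ beta - lam * alpha[i]) / w[i]
        alpha[i] += delta
        beta += delta * X[i]            # O(n) per step
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))  # m > n: the column-based method is preferred
    y = rng.standard_normal(200)
    lam = 1.0
    exact = np.linalg.solve(X.T @ X + lam * np.eye(50), X.T @ y)
    print(np.linalg.norm(rgs_ridge(X, y, lam, seed=1) - exact))
    print(np.linalg.norm(rk_ridge(X, y, lam, seed=1) - exact))
```

Maintaining the running quantities r = Xβ - y and β = X∗α keeps each iteration at O(m) or O(n) cost respectively, which is what makes row- and column-sampling methods attractive when the full system is too large to solve directly.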
Rights Information
© 2015 Hefny, Needell, Ramdas
Recommended Citation
A. Hefny, D. Needell, A. Ramdas. “Rows vs. Columns: Randomized Kaczmarz or Gauss-Seidel for Ridge Regression.” 2015.