Document Type
Article - preprint
Department
Claremont McKenna College, Mathematics (CMC)
Publication Date
8-30-2016
Abstract
We analyze a batched variant of Stochastic Gradient Descent (SGD) with a weighted sampling distribution for smooth and non-smooth objective functions. We show that by distributing the batches computationally, a significant speedup in the convergence rate is provably possible compared to either batched sampling or weighted sampling alone. We propose several computationally efficient schemes to approximate the optimal weights, and compute the proposed sampling distributions explicitly for the least squares and hinge loss problems. We show both analytically and experimentally that substantial gains can be obtained.
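
To make the sampling scheme concrete, below is a minimal Python sketch of batched SGD with weighted batch sampling for the least squares problem min_x (1/2)||Ax - b||^2. The function name, the constant step size, and the choice of weights (proportional to the squared spectral norms of the row blocks, used here as a surrogate for the optimal weights the paper derives) are illustrative assumptions, not the paper's exact prescription.

    import numpy as np

    def batched_weighted_sgd_ls(A, b, batch_size=8, n_iters=1000, seed=0):
        """Sketch: batched SGD with weighted batch sampling for least squares.

        Batch weights are proportional to the squared spectral norms of the
        row blocks -- an illustrative surrogate for the paper's optimal weights.
        """
        rng = np.random.default_rng(seed)
        n, d = A.shape
        # Fixed partition of the rows into consecutive batches.
        batches = [np.arange(i, min(i + batch_size, n))
                   for i in range(0, n, batch_size)]
        # Per-batch Lipschitz surrogates and the induced sampling distribution.
        lips = np.array([np.linalg.norm(A[tau], 2) ** 2 for tau in batches])
        probs = lips / lips.sum()
        step = 1.0 / lips.sum()  # conservative constant step (illustrative)
        x = np.zeros(d)
        for _ in range(n_iters):
            i = rng.choice(len(batches), p=probs)
            tau = batches[i]
            # Reweighting by 1/p_i makes this an unbiased estimate of A^T(Ax - b).
            grad = A[tau].T @ (A[tau] @ x - b[tau]) / probs[i]
            x -= step * grad
        return x

Reweighting each sampled batch gradient by 1/p_i keeps the update unbiased for the full gradient; the per-batch gradient A[tau].T @ (A[tau] @ x - b[tau]) is the piece that, per the abstract, can be distributed computationally.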
Rights Information
© 2016 Needell, Ward
Recommended Citation
D. Needell and R. Ward. “Batched Stochastic Gradient Descent with Weighted Sampling.” 2016.