
A Learning-Rate Schedule for Stochastic Gradient Methods to Matrix Factorization

Abstract. Stochastic gradient methods are effective for solving matrix factorization problems. However, it is well known that the performance of a stochastic gradient method depends heavily on the learning-rate schedule used; a good schedule can significantly speed up the training process. In this paper, motivated by past work on convex optimization that assigns a learning rate to each variable, we propose a new schedule for matrix factorization. Experiments demonstrate that the proposed schedule leads to faster convergence than existing ones. Our schedule uses the same parameter on all data sets included in our experiments, so the time spent on learning-rate selection can be significantly reduced. By applying this schedule to a state-of-the-art matrix factorization package, the resulting implementation outperforms available parallel matrix factorization packages.
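The per-variable idea the abstract alludes to can be sketched as SGD matrix factorization with an AdaGrad-style per-coordinate step size, where each factor coordinate keeps its own accumulated squared gradient. This is a minimal illustration of that family of schedules, not the paper's exact algorithm; the function name, toy ratings, and hyperparameters below are all illustrative assumptions.

```python
import numpy as np

def sgd_mf_percoord(R, k=2, eta0=0.2, lam=0.02, epochs=500, seed=0):
    """SGD matrix factorization with a per-coordinate (AdaGrad-style)
    learning-rate schedule. A sketch of the general per-variable idea,
    not the schedule proposed in the paper."""
    rng = np.random.default_rng(seed)
    rows, cols = zip(*R.keys())
    m, n = max(rows) + 1, max(cols) + 1
    P = 0.1 * rng.standard_normal((m, k))   # user factors
    Q = 0.1 * rng.standard_normal((n, k))   # item factors
    Gp = np.full((m, k), 1e-8)              # accumulated squared gradients (users)
    Gq = np.full((n, k), 1e-8)              # accumulated squared gradients (items)
    samples = list(R.items())
    for _ in range(epochs):
        for idx in rng.permutation(len(samples)):
            (u, v), r = samples[idx]
            e = r - P[u] @ Q[v]             # prediction error on this rating
            gp = -e * Q[v] + lam * P[u]     # gradient of regularized squared loss
            gq = -e * P[u] + lam * Q[v]
            Gp[u] += gp ** 2                # per-coordinate accumulation
            Gq[v] += gq ** 2
            P[u] -= eta0 / np.sqrt(Gp[u]) * gp   # each coordinate gets its own step size
            Q[v] -= eta0 / np.sqrt(Gq[v]) * gq
    return P, Q

# Toy example: observed entries of a 3x3 rating matrix.
R = {(0, 0): 5.0, (0, 1): 3.0, (1, 0): 4.0,
     (1, 2): 1.0, (2, 1): 1.0, (2, 2): 5.0}
P, Q = sgd_mf_percoord(R)
rmse = np.sqrt(np.mean([(r - P[u] @ Q[v]) ** 2 for (u, v), r in R.items()]))
```

Because the step size for each coordinate shrinks with its own gradient history, frequently updated coordinates slow down while rarely touched ones keep larger steps, which is the property that makes such schedules robust across data sets.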
Added 16 Apr 2016
Updated 16 Apr 2016
Type Conference
Year 2015
Where PAKDD
Authors Wei-Sheng Chin, Yong Zhuang, Yu-Chin Juan, Chih-Jen Lin