Stochastic Variance-reduced Gradient Descent for Low-rank Matrix Recovery from Linear Measurements

Authors: Quanquan Gu, Lingxiao Wang, Xiao Zhang

Abstract: We study the problem of estimating low-rank matrices from linear measurements (a.k.a. matrix sensing) through nonconvex optimization. We propose an efficient stochastic variance reduced gradient descent algorithm to solve the nonconvex optimization problem of matrix sensing. Our algorithm is applicable to both the noisy and noiseless settings. In the case with noisy observations, we prove that our algorithm converges to the unknown low-rank matrix at a linear rate up to the minimax optimal statistical error. In the noiseless setting, our algorithm is guaranteed to converge linearly to the unknown low-rank matrix and achieves exact recovery with optimal sample complexity. Most notably, the overall computational complexity of the proposed algorithm, defined as the iteration complexity times the per-iteration time complexity, is lower than that of state-of-the-art algorithms based on gradient descent. Experiments on synthetic data corroborate the superiority of the proposed algorithm over existing algorithms.
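To make the idea concrete, below is a minimal sketch of how an SVRG-style update can be applied to a factorized matrix sensing objective with squared loss on measurements y_i = <A_i, X*>. It is not the authors' exact algorithm: it omits any balancing regularizer, the paper's initialization, and its step-size choices, and the function name `svrg_matrix_sensing` and hyperparameters `eta`, `epochs`, and `m` are illustrative assumptions.

```python
# Sketch: SVRG-style gradient descent on the factorized objective
#   f(U, V) = (1/2n) * sum_i (<A_i, U V^T> - y_i)^2
# This is an illustrative sketch, not the paper's exact algorithm.
import numpy as np

def svrg_matrix_sensing(A, y, r, eta=0.01, epochs=20, m=None, seed=0):
    """A: (n, d1, d2) sensing matrices, y: (n,) measurements, r: target rank."""
    rng = np.random.default_rng(seed)
    n, d1, d2 = A.shape
    m = m or n                                  # inner-loop (epoch) length
    U = 0.1 * rng.standard_normal((d1, r))      # small random initialization
    V = 0.1 * rng.standard_normal((d2, r))      # (the paper uses a more careful one)

    def grad(U, V, idx):
        # Gradient of the squared loss averaged over the indices in idx.
        res = np.einsum('kij,ij->k', A[idx], U @ V.T) - y[idx]   # residuals
        S = np.einsum('k,kij->ij', res, A[idx]) / len(idx)       # avg of res_i * A_i
        return S @ V, S.T @ U                                    # grads w.r.t. U and V

    for _ in range(epochs):
        # Full gradient at a reference point: the variance-reduction anchor.
        U0, V0 = U.copy(), V.copy()
        GU_full, GV_full = grad(U0, V0, np.arange(n))
        for _ in range(m):
            i = np.array([rng.integers(n)])
            gU, gV = grad(U, V, i)
            gU0, gV0 = grad(U0, V0, i)
            # SVRG step: stochastic gradient corrected by the anchor gradient.
            U -= eta * (gU - gU0 + GU_full)
            V -= eta * (gV - gV0 + GV_full)
    return U, V
```

Each epoch recomputes one full gradient at a reference point and uses it to correct the per-sample stochastic gradients; this correction drives the gradient variance to zero near the solution, which is what allows a constant step size and the linear convergence discussed in the abstract.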
