Authors: Cun Mu, Yuqian Zhang, John Wright, Donald Goldfarb
DOI: 10.1137/15M101628X
Keywords:
Abstract: Recovering matrices from compressive and grossly corrupted observations is a fundamental problem in robust statistics, with rich applications in computer vision and machine learning. In theory, under certain conditions, this problem can be solved in polynomial time via a natural convex relaxation, known as Compressive Principal Component Pursuit (CPCP). However, all existing provable algorithms for CPCP suffer from superlinear per-iteration cost, which severely limits their applicability to large-scale problems. In this paper, we propose provable, scalable, and efficient methods to solve CPCP with (essentially) linear per-iteration cost. Our method combines classical ideas from Frank-Wolfe and proximal methods. In each iteration, we mainly exploit Frank-Wolfe to update the low-rank component via a rank-one SVD step and use a proximal step for the sparse term. Convergence results and implementation details are also discussed. We demonstrate the scalability of the proposed approach with promising numerical experiments on visual data.
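The abstract describes an iteration that pairs a Frank-Wolfe (rank-one SVD) update for the low-rank component with a proximal update for the sparse component. The sketch below only illustrates that general pattern on a simplified, fully observed robust-PCA-style objective with a nuclear-norm ball constraint; it is not the paper's CPCP algorithm, and the function and parameter names (fw_prox_rpca, tau, lam, n_iter) are illustrative assumptions.

    # Minimal sketch (illustrative, not the authors' exact method):
    # alternate a Frank-Wolfe step on L over {||L||_* <= tau} with a
    # proximal (soft-thresholding) step on the sparse term S, for
    #   min_{L,S} 0.5 * ||L + S - D||_F^2.
    import numpy as np
    from scipy.sparse.linalg import svds  # rank-one SVD of the gradient

    def soft_threshold(X, lam):
        """Entrywise soft-thresholding: prox of lam * ||.||_1."""
        return np.sign(X) * np.maximum(np.abs(X) - lam, 0.0)

    def fw_prox_rpca(D, tau=1.0, lam=0.1, n_iter=100):
        m, n = D.shape
        L = np.zeros((m, n))
        S = np.zeros((m, n))
        for k in range(n_iter):
            G = L + S - D  # gradient of the smooth loss w.r.t. L
            # Frank-Wolfe linear minimization over the nuclear-norm ball:
            # only the leading singular pair of G is needed (rank-one SVD).
            u, s, vt = svds(G, k=1)
            V = -tau * np.outer(u[:, 0], vt[0, :])
            gamma = 2.0 / (k + 2.0)  # standard Frank-Wolfe step size
            L = (1.0 - gamma) * L + gamma * V
            # Proximal (soft-thresholding) update for the sparse component.
            S = soft_threshold(D - L, lam)
        return L, S

The per-iteration cost here is dominated by the rank-one SVD and the entrywise thresholding, which is the (essentially) linear cost per iteration that the abstract emphasizes, in contrast to methods that require a full SVD each iteration.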