Authors: Yiu-ming Cheung, Jian Lou
Keywords: Mathematical optimization, Matrix norm, Norm (mathematics), Convex relaxation, Computer science, Low-rank approximation, Scalability, Subspace topology, Sparse matrix, Regularization (mathematics), Computation
Abstract: As a fundamental tool in data mining and computer vision, robust low rank subspace learning aims to recover a low rank matrix under gross corruptions, which are often modeled by an additional sparse matrix. Within this learning framework, this paper investigates the spectral k-support norm, a more appealing convex relaxation than the popular nuclear norm, as the low rank penalty. Despite its better recovery performance, the spectral k-support norm entails a model that is difficult to optimize efficiently, which severely limits its scalability from a practical perspective. Therefore, this paper proposes a scalable and efficient algorithm that considers the dual objective of the original problem and can exploit the fact that the associated linear oracle can be cheaply evaluated. Further, by studying the sub-gradient of the loss of the dual objective, a line-search strategy is adopted to enable the algorithm to adapt to Hölder smoothness. Experiments on various tasks demonstrate the superior prediction performance and computation efficiency of the proposed algorithm.
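For illustration, below is a minimal sketch of the spectral k-support norm itself, computed as the vector k-support norm of the singular values following the standard definition of Argyriou et al. (2012). This is not the authors' implementation or algorithm; the NumPy-based code and function names are our own assumptions.

```python
import numpy as np

def k_support_norm(w, k):
    """Vector k-support norm (Argyriou et al., 2012), a hypothetical helper."""
    # Sort absolute values in nonincreasing order.
    z = np.sort(np.abs(np.asarray(w, dtype=float)))[::-1]
    d = z.size
    k = min(k, d)
    # Find the unique r in {0, ..., k-1} such that (1-indexed)
    #   z_{k-r-1} > (1/(r+1)) * sum_{i=k-r}^{d} z_i >= z_{k-r},
    # with the convention z_0 = +inf.
    for r in range(k):
        tail = z[k - r - 1:].sum()                        # sum_{i=k-r}^{d} z_i
        upper = np.inf if k - r - 2 < 0 else z[k - r - 2]
        # The theory guarantees such an r exists; r == k - 1 is a safety
        # fallback against strict-inequality failures under numerical ties.
        if upper > tail / (r + 1) >= z[k - r - 1] or r == k - 1:
            head = np.sum(z[:k - r - 1] ** 2)             # sum_{i<k-r} z_i^2
            return float(np.sqrt(head + tail ** 2 / (r + 1)))

def spectral_k_support_norm(X, k):
    """Spectral k-support norm: the k-support norm of X's singular values."""
    s = np.linalg.svd(np.asarray(X, dtype=float), compute_uv=False)
    return k_support_norm(s, k)

# Sanity checks: k = 1 recovers the nuclear norm, k = min(m, n) the Frobenius norm.
X = np.random.randn(50, 40)
assert np.isclose(spectral_k_support_norm(X, 1),
                  np.linalg.svd(X, compute_uv=False).sum())
assert np.isclose(spectral_k_support_norm(X, 40), np.linalg.norm(X, 'fro'))
```

As the sanity checks suggest, the norm interpolates between the nuclear norm (k = 1) and the Frobenius norm (k = min(m, n)), which is one way to see why it can be a tighter convex relaxation for low rank recovery than the nuclear norm alone.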