Authors: Manolis C. Tsakiris, Rene Vidal, Daniel P. Robinson, Zhihui Zhu, Daniel Q. Naiman
DOI:
Keywords: Algorithm, Computer science, RANSAC, Linear subspace, Rate of convergence, Geometric analysis, Dimension (vector space), Optimization problem, Matrix norm, Subspace topology
Abstract: Recent methods for learning a linear subspace from data corrupted by outliers are based on convex $\ell_1$ and nuclear norm optimization and require the dimension of the subspace and the number of outliers to be sufficiently small. In sharp contrast, the recently proposed Dual Principal Component Pursuit (DPCP) method can provably handle subspaces of high dimension by solving a non-convex $\ell_1$ optimization problem on the sphere. However, its geometric analysis is based on quantities that are difficult to interpret and are not amenable to statistical analysis. In this paper we provide a refined geometric analysis and a new statistical analysis that show that DPCP can tolerate as many outliers as the square of the number of inliers, thus improving upon other provably correct robust PCA methods. We also propose a scalable Projected Sub-Gradient Method (DPCP-PSGM) for solving the DPCP problem, and show that it admits linear convergence even though the underlying optimization problem is non-convex and non-smooth. Experiments on road plane detection from 3D point cloud data demonstrate that DPCP-PSGM can be more efficient than the traditional RANSAC algorithm, which is one of the most popular methods for such computer vision applications.
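The DPCP problem seeks a unit vector $b$ minimizing $\|X^\top b\|_1$, i.e., a normal direction that is nearly orthogonal to the inliers. Below is a minimal NumPy sketch of a projected sub-gradient iteration for this problem, for readers who want a concrete picture; the function name `dpcp_psgm`, the random initialization, the averaged sub-gradient, and the geometrically decaying step-size schedule are illustrative assumptions, not the paper's exact algorithm or parameter choices.

```python
import numpy as np

def dpcp_psgm(X, mu0=1e-2, beta=0.9, n_iters=200, seed=0):
    """Illustrative projected sub-gradient sketch for
    min_{||b||_2 = 1} ||X^T b||_1.

    X : (D, N) data matrix whose unit-norm columns are points, some of
        which may be outliers; the returned b approximates a normal
        vector to the inlier subspace (hyperplane case).
    """
    rng = np.random.default_rng(seed)
    b = rng.standard_normal(X.shape[0])
    b /= np.linalg.norm(b)                    # start on the unit sphere
    mu = mu0
    for _ in range(n_iters):
        # Averaged sub-gradient of ||X^T b||_1 (sign(0) = 0 is a valid
        # choice); dividing by N keeps the step scale independent of N.
        g = X @ np.sign(X.T @ b) / X.shape[1]
        b = b - mu * g                        # sub-gradient step
        b /= np.linalg.norm(b)                # project back onto sphere
        mu *= beta                            # geometric step-size decay
    return b

if __name__ == "__main__":
    # Hypothetical synthetic check: inliers span the hyperplane with
    # normal e_D (last coordinate zero), plus random outliers.
    rng = np.random.default_rng(1)
    D, n_in, n_out = 10, 500, 1000
    inliers = np.vstack([rng.standard_normal((D - 1, n_in)),
                         np.zeros((1, n_in))])
    outliers = rng.standard_normal((D, n_out))
    X = np.hstack([inliers, outliers])
    X /= np.linalg.norm(X, axis=0)            # unit-normalize columns
    b = dpcp_psgm(X)
    print("|<b, e_D>| =", abs(b[-1]))         # ideally close to 1
```

The geometrically decaying step size mirrors the linear-convergence behavior claimed in the abstract; a slowly diminishing schedule such as $\mu_k = \mu_0 / k$ would typically also converge, only at a sublinear rate.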