Authors: Andrew Gitlin, Biaoshuai Tao, Laura Balzano, John Lipor
DOI: 10.1109/JSTSP.2018.2869363
Keywords: Linear subspace, Computer science, Subspace topology, Cluster analysis, Outlier, Computational complexity theory, Synthetic data, Algorithm design, Initialization, Algorithm
Abstract: Subspace clustering is a powerful generalization of clustering for high-dimensional data analysis, where low-rank cluster structure is leveraged for accurate inference. $K$-Subspaces (KSS), an alternating algorithm that mirrors $K$-means, is a classical approach for clustering with this model. Like $K$-means, KSS is highly sensitive to initialization, yet it has two major handicaps beyond this issue. First, unlike the $K$-means objective, the KSS objective is NP-hard to approximate within any finite factor for large enough subspace rank. Second, the $\ell_2$ subspace estimation step is known to be faulty when the estimated cluster contains points from multiple subspaces. In this paper, we demonstrate both of these additional drawbacks, provide a proof of the former, and offer a solution to the latter through the use of a robust subspace recovery (RSR) method known as coherence pursuit (CoP). While many RSR methods have been developed in recent years, few can handle the case where the outliers are themselves low rank. We prove that CoP can reject such low-rank outliers. This property and its low computational complexity make it ideal to incorporate into KSS. We demonstrate on synthetic data that CoP successfully rejects low-rank outliers and show that combining CoP with KSS yields state-of-the-art clustering performance on canonical benchmark datasets.
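To make the alternating structure described in the abstract concrete, below is a minimal, illustrative Python sketch of a generic K-Subspaces-style loop: points are assigned to the subspace with the smallest projection residual, and each subspace is then refit by a rank-r PCA of its assigned points. The function name `kss_sketch` and all parameters are hypothetical and do not reproduce the authors' implementation; in particular, the plain SVD update is exactly the $\ell_2$ estimation step that the paper proposes to replace with a robust method such as CoP.

```python
import numpy as np

def kss_sketch(X, K, r, n_iter=30, seed=0):
    """Illustrative K-Subspaces-style alternating loop (not the paper's code).

    X : (D, N) data matrix, K : number of subspaces, r : subspace rank.
    """
    rng = np.random.default_rng(seed)
    D, N = X.shape
    # Random orthonormal bases as a (notoriously sensitive) initialization.
    U = [np.linalg.qr(rng.standard_normal((D, r)))[0] for _ in range(K)]
    labels = np.zeros(N, dtype=int)
    for _ in range(n_iter):
        # Assignment step: each point goes to the subspace with the
        # smallest projection residual ||x - U_k U_k^T x||_2.
        residuals = np.stack(
            [np.linalg.norm(X - Uk @ (Uk.T @ X), axis=0) for Uk in U]
        )
        labels = residuals.argmin(axis=0)
        # Estimation step: refit each subspace from its assigned points via
        # a rank-r SVD. This is the l2 step that fails when a cluster mixes
        # points from multiple subspaces, motivating the CoP-based variant.
        for k in range(K):
            Xk = X[:, labels == k]
            if Xk.shape[1] >= r:
                U[k] = np.linalg.svd(Xk, full_matrices=False)[0][:, :r]
    return labels, U
```

The residual-based assignment mirrors the distance-based assignment of $K$-means, which is why the two algorithms share the same sensitivity to initialization noted in the abstract.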