Authors: Constantine Caramanis, Yudong Chen, Shie Mannor
DOI:
Keywords:
Abstract: We consider high dimensional sparse regression with arbitrary - possibly, severe or coordinated - errors in the covariates matrix. We are interested in understanding how many corruptions we can tolerate, while identifying the correct support. To the best of our knowledge, neither standard outlier rejection techniques, nor recently developed robust regression algorithms (that focus only on corrupted response variables), nor recent algorithms for dealing with stochastic noise or erasures, can provide guarantees on support recovery. As we show, neither can the natural brute force algorithm that takes exponential time to find the subset of data and support columns that yields the smallest regression error. We explore the power of a simple idea: replace the essential linear algebraic calculation - the inner product - with a robust counterpart that cannot be greatly affected by a controlled number of arbitrarily corrupted points: the trimmed inner product. We consider three popular algorithms in the uncorrupted setting: Thresholding Regression, Lasso, and the Dantzig selector, and show that the counterparts obtained using the trimmed inner product are provably robust.
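The abstract's central object is the trimmed inner product: an inner product that discards a controlled number of the largest elementwise contributions so that a few arbitrarily corrupted coordinates cannot dominate the result. The following is a minimal sketch of that idea in Python; the exact trimming rule (dropping the largest-magnitude products) and the parameter names are assumptions for illustration, not the paper's precise definition.

```python
import numpy as np

def trimmed_inner_product(x, y, h):
    """Sketch of a trimmed inner product: sum the elementwise products
    of x and y after removing the h largest-magnitude products
    (assumed trimming rule, for illustration only)."""
    products = x * y
    if h <= 0:
        return products.sum()
    # Indices of the h elementwise products with the largest magnitude
    drop = np.argsort(np.abs(products))[-h:]
    mask = np.ones_like(products, dtype=bool)
    mask[drop] = False
    return products[mask].sum()

# Toy example: one corrupted covariate entry swamps the ordinary inner
# product but barely moves the trimmed version.
rng = np.random.default_rng(0)
x = rng.standard_normal(100)
y = rng.standard_normal(100)
x_corrupt = x.copy()
x_corrupt[0] = 1e6  # a single adversarially corrupted entry

print(np.dot(x_corrupt, y))                    # dominated by the corruption
print(trimmed_inner_product(x_corrupt, y, 2))  # close to np.dot(x, y)
```

In this sketch, plugging such a trimmed product into the inner-product steps of Thresholding Regression, the Lasso, or the Dantzig selector is what the abstract refers to as their robust counterparts; the precise way each algorithm is modified is specified in the paper itself.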