Authors: Eugene Ndiaye, Olivier Fercoq, Alexandre Gramfort, Vincent Leclère, Joseph Salmon
DOI: 10.1088/1742-6596/904/1/012006
Keywords: Uncertainty quantification, Elastic net regularization, Algorithm, Artificial intelligence, Machine learning, Curve fitting, Solver, Coordinate descent, Leverage (statistics), Mathematics, Computation, Numerical stability
Abstract: In high dimensional settings, sparse structures are crucial for efficiency, both in terms of memory, computation and performance. It is customary to consider an $\ell_1$ penalty to enforce sparsity in such scenarios. Sparsity enforcing methods, the Lasso being a canonical example, are popular candidates to address high dimension. For efficiency, they rely on a tuning parameter trading data fitting versus sparsity. For the Lasso theory to hold, this parameter should be proportional to the noise level, yet the latter is often unknown in practice. A possible remedy is to jointly optimize over the regression parameter as well as over the noise level. This has been considered under several names in the literature: Scaled-Lasso, Square-root Lasso, Concomitant Lasso estimation, for instance, and could be of interest for confidence sets or uncertainty quantification. In this work, after illustrating numerical difficulties of the Concomitant Lasso formulation, we propose a modification, coined the Smoothed Concomitant Lasso, aimed at increasing numerical stability. We propose an efficient and accurate solver leading to a computational cost no more expensive than the one for the Lasso. We leverage standard ingredients behind the success of fast Lasso solvers: a coordinate descent algorithm, combined with safe screening rules, to achieve speed by eliminating early irrelevant features.
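The abstract describes joint optimization over the regression coefficients and the noise level, with the noise level clipped from below for stability, solved by coordinate descent. Below is a minimal illustrative sketch of that alternating scheme, assuming the objective $\|y - X\beta\|^2/(2n\sigma) + \sigma/2 + \lambda\|\beta\|_1$ with the constraint $\sigma \ge \sigma_0$; the function name, the plain cyclic update order, and the fixed iteration count are illustration choices, and the safe screening rules the paper uses to eliminate features early are omitted here. It is not the authors' released solver.

```python
import numpy as np

def smoothed_concomitant_lasso(X, y, lam, sigma_0, n_iter=100):
    """Illustrative alternating coordinate-descent sketch (hypothetical helper) for
        min_{beta, sigma >= sigma_0}
            ||y - X beta||^2 / (2 n sigma) + sigma / 2 + lam * ||beta||_1.
    """
    n, p = X.shape
    beta = np.zeros(p)
    resid = y.copy()                    # residual r = y - X @ beta
    col_sq = (X ** 2).sum(axis=0)       # precomputed squared column norms
    sigma = max(np.linalg.norm(resid) / np.sqrt(n), sigma_0)

    for _ in range(n_iter):
        for j in range(p):
            if col_sq[j] == 0.0:
                continue
            # inner product with the partial residual that excludes feature j
            rho = X[:, j] @ resid + col_sq[j] * beta[j]
            # for fixed sigma, the beta-step is a Lasso coordinate update
            # with effective penalty lam * sigma (soft-thresholding)
            new_bj = np.sign(rho) * max(abs(rho) - n * lam * sigma, 0.0) / col_sq[j]
            if new_bj != beta[j]:
                resid += X[:, j] * (beta[j] - new_bj)
                beta[j] = new_bj
        # closed-form sigma-step; the clipping at sigma_0 is the "smoothing"
        # that prevents the noise estimate from collapsing to zero
        sigma = max(np.linalg.norm(resid) / np.sqrt(n), sigma_0)
    return beta, sigma
```

The sigma-step follows from setting the derivative of $c/(2n\sigma) + \sigma/2$ (with $c = \|r\|^2$) to zero, giving $\sigma = \|r\|/\sqrt{n}$ before clipping, which is why each outer pass is essentially one Lasso sweep plus a cheap norm computation.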