Authors: Jie Wang, Wei Fan, Jieping Ye
DOI: 10.1109/TPAMI.2014.2388203
Keywords: Affine transformation, Feature dimension, Monotonic function, Lasso (statistics), Mathematical optimization, Smoothness, Feature extraction, Mathematics, Dual (category theory), Speedup
Abstract: Fused Lasso is a popular regression technique that encodes the smoothness of the data. It has been applied successfully to many applications with a smooth feature structure. However, the computational cost of the existing solvers for fused Lasso is prohibitive when the feature dimension is extremely large. In this paper, we propose novel screening rules that are able to quickly identify adjacent features with the same coefficients. As a result, the number of variables to be estimated can be significantly reduced, leading to substantial savings in computational cost and memory usage. To the best of our knowledge, the proposed approach is the first attempt to develop screening methods for the fused Lasso problem with a general data matrix. Our major contributions are: 1) we derive a new dual formulation of fused Lasso that comes with several desirable properties; 2) we show that fused Lasso is equivalent to the standard Lasso under two affine transformations; 3) we propose a framework for developing effective and efficient screening rules for Fused Lasso via the Monotonicity of Subdifferentials (FLAMS). Some appealing features of FLAMS are: (1) the rules are safe, in the sense that the detected adjacent features are guaranteed to have the same coefficients; (2) the dataset needs to be scanned only once to run the screening, whose computational cost is negligible compared to that of solving the fused Lasso; (3) the rules are independent of the solver and can be integrated with any existing solvers. We evaluated FLAMS on both synthetic and real datasets. The experiments indicate that FLAMS is very effective in identifying adjacent features with the same coefficients, and the speedup gained can be orders of magnitude.
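To make the objective concrete, the sketch below evaluates the standard fused Lasso criterion that the abstract refers to: a least-squares loss plus an l1 penalty on the coefficients (sparsity) and an l1 penalty on differences of adjacent coefficients (smoothness). The function name and the regularization parameters `lam1`/`lam2` are illustrative conventions, not identifiers from the paper.

```python
import numpy as np

def fused_lasso_objective(X, y, beta, lam1, lam2):
    """Fused Lasso objective (illustrative form):
    0.5 * ||y - X @ beta||^2 + lam1 * sum|beta_i| + lam2 * sum|beta_{i+1} - beta_i|.
    The second penalty encourages adjacent features to share coefficients,
    which is exactly the structure the FLAMS screening rules exploit."""
    residual = y - X @ beta
    loss = 0.5 * np.dot(residual, residual)
    sparsity = lam1 * np.sum(np.abs(beta))
    smoothness = lam2 * np.sum(np.abs(np.diff(beta)))
    return loss + sparsity + smoothness

# Toy example: a run of equal adjacent coefficients incurs no fusion penalty.
X = np.eye(4)
y = np.array([1.0, 1.0, 0.0, 0.0])
beta = np.array([1.0, 1.0, 0.0, 0.0])
print(fused_lasso_objective(X, y, beta, lam1=0.1, lam2=0.5))  # 0.0 + 0.2 + 0.5
```

Note how the fusion term is nonzero only at the single jump between positions 2 and 3; collapsing runs of equal coefficients into one variable is what lets the screening rules shrink the problem size.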