Authors: Inderjit S. Dhillon, Brian Kulis, Suvrit Sra
DOI:
Keywords:
Abstract: Many important machine learning problems are modeled and solved via semidefinite programs; examples include metric learning, nonlinear embedding, and certain clustering problems. Often, off-the-shelf software is invoked for the associated optimization, which can be inappropriate due to excessive computational and storage requirements. In this paper, we introduce the use of convex perturbations for solving semidefinite programs (SDPs), and for a specific perturbation we derive an algorithm that has several advantages over existing techniques: a) it is simple, requiring only a few lines of MATLAB, b) it is a first-order method, and thereby scalable, c) it can easily exploit the structure of a given SDP (e.g., when the constraint matrices are low-rank, a situation common to several SDPs). A pleasant byproduct of our method is a fast, kernelized version of large-margin nearest neighbor (Weinberger et al., 2005). We demonstrate that our method is effective in finding fast approximations to large-scale SDPs arising in some applications.
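To make the idea concrete, here is a minimal sketch of the kind of first-order scheme the abstract alludes to. It is not the authors' actual algorithm: it assumes a toy perturbed problem min_{X ⪰ 0} ⟨C, X⟩ + (ρ/2)‖X‖_F², where the strongly convex term (ρ/2)‖X‖_F² plays the role of the "convex perturbation," solved by projected gradient descent with an eigenvalue-clipping projection onto the PSD cone. The function names (`project_psd`, `perturbed_sdp`) and all parameter choices are illustrative.

```python
import numpy as np

def project_psd(A):
    # Project onto the PSD cone: symmetrize, then clip negative eigenvalues.
    S = (A + A.T) / 2.0
    w, V = np.linalg.eigh(S)
    return (V * np.maximum(w, 0.0)) @ V.T

def perturbed_sdp(C, rho=1.0, steps=200, lr=0.5):
    # Projected gradient on  min_{X >= 0}  <C, X> + (rho/2) * ||X||_F^2.
    # The (rho/2)||X||_F^2 term is a stand-in for the convex perturbation:
    # it makes the objective strongly convex, so a plain first-order
    # method converges geometrically.
    X = np.zeros_like(C)
    for _ in range(steps):
        grad = C + rho * X          # gradient of the perturbed objective
        X = project_psd(X - lr * grad)
    return X

C = np.array([[1.0, -2.0],
              [-2.0, -1.0]])
X = perturbed_sdp(C)
```

For this toy problem the optimum has the closed form `project_psd(-C / rho)` (the objective equals ½‖X + C/ρ‖_F² up to a constant when ρ = 1), which makes the iteration easy to sanity-check; in a realistic SDP with linear constraints the projection step is where problem structure, such as low-rank constraint matrices, would be exploited.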