Computing Matrix Squareroot via Non Convex Local Search

Authors: Sham M. Kakade, Praneeth Netrapalli, Prateek Jain, Chi Jin

DOI:

Keywords: Upper and lower bounds, Combinatorics, Matrix (mathematics), Positive-definite matrix, Exponent, Eigendecomposition of a matrix, Numerical linear algebra, Matrix multiplication, Condition number, Mathematics

Abstract: We consider the problem of computing the squareroot of a positive semidefinite (PSD) matrix. Several fast algorithms (some based on eigenvalue decomposition and some on Taylor expansion) are known to solve this problem. In this paper, we propose another way to solve this problem: a natural algorithm performing gradient descent on a non-convex formulation of the matrix squareroot problem. We show that for an $n\times n$ input PSD matrix ${M}$, if the initial point is well conditioned, then the algorithm finds an $\epsilon$-accurate solution in $O\left(\kappa^{3/2} \log \frac{\left\|{M}\right\|_F}{\epsilon}\right)$ iterations, where $\kappa$ is the condition number of $M$. Each iteration involves three matrix multiplications (and does not use either matrix inversions or solutions of a linear system), giving a total run time of $O\left(n^{\omega}\kappa^{3/2}\log\frac{\left\|{M}\right\|_F}{\epsilon}\right)$, where $\omega$ is the matrix multiplication exponent. Furthermore, our analysis is robust to errors in each iteration. We also show a lower bound of $\Omega(\kappa)$ iterations for our algorithm, demonstrating that the dependence of our result on $\kappa$ is necessary. Existing analyses of similar algorithms (e.g., Newton's method) require commutativity of the iterates with $M$, which is ensured by choosing the starting point carefully. Our analysis, on the other hand, is much more general and does not require each iterate to commute with $M$. Consequently, our result guarantees convergence from a wide range of starting points. More generally, our result demonstrates that non-convex optimization can be a viable approach to obtaining fast and robust algorithms. Our argument is quite general and we believe it will find applications in designing such algorithms for other problems in numerical linear algebra.
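
The gradient-descent iteration the abstract describes can be sketched as follows. This is an illustrative reconstruction, not the paper's exact algorithm or constants: it minimizes $f(X) = \frac{1}{4}\|X^2 - M\|_F^2$, whose gradient step uses exactly three matrix multiplications per iteration, as the abstract states; the starting point and step size below are heuristic choices.

```python
import numpy as np

def matrix_sqrt_gd(M, num_iters=1000, step=None):
    """Approximate the PSD square root of M by gradient descent on
    f(X) = ||X @ X - M||_F^2 / 4. Illustrative sketch: step size and
    initialization are heuristic, not the paper's constants."""
    n = M.shape[0]
    # Well-conditioned symmetric starting point: a scaled identity.
    X = np.sqrt(np.trace(M) / n) * np.eye(n)
    if step is None:
        # Heuristic step size based on the spectral norm of M.
        step = 1.0 / (4.0 * np.linalg.norm(M, 2))
    for _ in range(num_iters):
        R = X @ X - M                   # residual (multiplication 1)
        # Gradient of f at symmetric X is (R @ X + X @ R) / 2
        # (multiplications 2 and 3); no inversions or linear solves.
        X = X - step * (R @ X + X @ R)
    return X
```

Note that if `X` starts symmetric, every iterate stays symmetric, since `R @ X + X @ R` is symmetric whenever `R` and `X` are.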

References (0)