Authors: Michael B. Cohen, Jelani Nelson, David P. Woodruff
DOI:
Keywords:
Abstract: We prove, using the subspace embedding guarantee in a black box way, that one can achieve the spectral norm guarantee for approximate matrix multiplication with a dimensionality-reducing map having $m = O(\tilde{r}/\varepsilon^2)$ rows. Here $\tilde{r}$ is the maximum stable rank, i.e. the squared ratio of Frobenius and operator norms, of the two matrices being multiplied. This is a quantitative improvement over previous work [MZ11, KVZ14], and is also optimal for any oblivious dimensionality-reducing map. Furthermore, due to the black box reliance on the subspace embedding property in our proofs, our theorem can be applied to a much more general class of sketching matrices than was known before, in addition to achieving better bounds. For example, one can apply our theorem to efficient subspace embeddings such as the Subsampled Randomized Hadamard Transform or sparse subspace embeddings, or even to subspace embedding constructions that may be developed in the future. Our main theorem, via connections with spectral error matrix multiplication shown in prior work, implies quantitative improvements for approximate least squares regression and low rank approximation. Our result has already been applied to improve dimensionality reduction guarantees for $k$-means clustering [CEMMP14], and implies new results for nonparametric regression [YPW15]. We separately point out that the proof of the "BSS" deterministic row-sampling result of [BSS12] can be modified to show that for any matrices $A, B$ of stable rank at most $\tilde{r}$, the spectral norm guarantee for approximate matrix multiplication of $A^T B$ can be achieved by deterministically sampling $O(\tilde{r}/\varepsilon^2)$ rows that can be found in polynomial time. The original result was for rank instead of stable rank. This observation leads to a stronger version of a main theorem of [KMST10].
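The following is a minimal numerical sketch of the technique the abstract describes, not the paper's proof: it computes the stable rank $\tilde{r}$ (squared ratio of Frobenius and operator norms), draws an oblivious Gaussian sketch $S$ with $m = O(\tilde{r}/\varepsilon^2)$ rows (the constant factor is omitted; the theorem also covers SRHT and sparse embeddings), and compares the spectral-norm error of $(SA)^T(SB)$ against $\varepsilon \|A\|\,\|B\|$. All matrix sizes and the choice of a Gaussian map are illustrative assumptions.

```python
import numpy as np

def stable_rank(M):
    """Squared ratio of Frobenius and operator (spectral) norms."""
    return np.linalg.norm(M, "fro") ** 2 / np.linalg.norm(M, 2) ** 2

rng = np.random.default_rng(0)
n, d = 2000, 30  # illustrative dimensions
A = rng.standard_normal((n, d))
B = rng.standard_normal((n, d))

eps = 0.5
r_tilde = max(stable_rank(A), stable_rank(B))  # maximum stable rank
m = int(np.ceil(r_tilde / eps ** 2))           # m = O(r_tilde / eps^2), constant omitted

# Oblivious Gaussian sketching matrix with m rows
S = rng.standard_normal((m, n)) / np.sqrt(m)

# Spectral-norm error of the sketched product vs. the guarantee's scale
err = np.linalg.norm((S @ A).T @ (S @ B) - A.T @ B, 2)
bound = eps * np.linalg.norm(A, 2) * np.linalg.norm(B, 2)
print(f"m = {m}, spectral error = {err:.3f}, eps*||A||*||B|| = {bound:.3f}")
```

Because the sketch is oblivious, $S$ is drawn without looking at $A$ or $B$; the same $S$ would work for any pair of matrices of comparable stable rank.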