We develop fast spectral algorithms for tensor decomposition that match the robustness guarantees of the best known polynomial-time algorithms for this problem based on the sum-of-squares (SOS) semidefinite programming hierarchy.
Our algorithms can decompose a 4-tensor with \(n\)-dimensional orthonormal components in the presence of error with constant spectral norm (when viewed as an \(n^2\)-by-\(n^2\) matrix). The running time is \(n^5\), which is close to linear in the input size \(n^4\).
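To make the error model concrete, here is a minimal NumPy sketch (our own illustration, not the paper's code) of a 4-tensor with orthonormal components, flattened into an \(n^2\)-by-\(n^2\) matrix whose spectral norm measures the noise:

```python
# Illustrative sketch, assuming the standard flattening of a 4-tensor
# T = sum_i a_i (x) a_i (x) a_i (x) a_i into an n^2-by-n^2 matrix.
import numpy as np

n = 10
A = np.linalg.qr(np.random.randn(n, n))[0]   # orthonormal components a_1, ..., a_n (columns)
T = sum(np.einsum('i,j,k,l->ijkl', a, a, a, a) for a in A.T)

M = T.reshape(n**2, n**2)                    # view the 4-tensor as an n^2-by-n^2 matrix
spectral_norm = np.linalg.norm(M, ord=2)     # largest singular value; error E would be
                                             # measured by this norm after flattening
```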
We also obtain algorithms with similar running time to learn sparsely-used orthogonal dictionaries even when feature representations have constant relative sparsity and non-independent coordinates.
The only previous polynomial-time algorithms to solve these problems are based on solving large semidefinite programs. In contrast, our algorithms are easy to implement directly and are based on spectral projections and tensor-mode rearrangements.
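The two primitives named above are elementary matrix operations; the following hedged sketch (ours, not the paper's pseudocode) shows one way they can be realized, with the mode grouping and rank \(k\) chosen purely for illustration:

```python
# Illustrative sketch, assuming: a mode rearrangement is a transpose-and-reshape
# of the 4-tensor, and a spectral projection keeps a top eigenspace.
import numpy as np

n, k = 10, 3
T = np.random.randn(n, n, n, n)

# Tensor-mode rearrangement: regroup modes (1,3) vs (2,4) before flattening.
M = T.transpose(0, 2, 1, 3).reshape(n**2, n**2)

# Spectral projection: project onto the span of the top-k eigenvectors
# of the symmetrized flattening.
S = (M + M.T) / 2
eigvals, eigvecs = np.linalg.eigh(S)         # eigenvalues in ascending order
U = eigvecs[:, -k:]                          # top-k eigenvectors
P = U @ U.T                                  # orthogonal projector onto their span
```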
Our work is inspired by recent work of Hopkins, Schramm, Shi, and Steurer (STOC'16) that shows how fast spectral algorithms can achieve the guarantees of SOS for average-case problems. In this work, we introduce general techniques to capture the guarantees of SOS for worst-case problems.
Keywords: sum-of-squares method, tensor computations, eigenvalues, machine learning, semidefinite programming.