Tensor decompositions, sum-of-squares proofs, and spectral algorithms
Simons Institute in Berkeley, Quarterly Northwestern theory workshop, University of Pennsylvania theory lunch.
The problem of finding low-rank decompositions of tensors has a wide range of applications, e.g., learning various mixture models. In the overcomplete regime, when the rank is superlinear in the dimension, most known provable decomposition methods for third-order tensors break down.
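As a concrete illustration (not taken from the talk), the following sketch builds a random symmetric third-order tensor of rank r from unit vectors and shows what "overcomplete" means: the rank r exceeds the dimension n. The dimensions and seed are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 10, 20  # overcomplete: rank r exceeds dimension n

# Random unit vectors a_1, ..., a_r as columns of A
A = rng.standard_normal((n, r))
A /= np.linalg.norm(A, axis=0)

# Rank-r symmetric third-order tensor T = sum_i a_i (x) a_i (x) a_i
T = np.einsum('ir,jr,kr->ijk', A, A, A)

# The decomposition problem: given only T, approximately recover the a_i.
print(T.shape)  # (10, 10, 10)
```

The tensor has n^3 = 1000 entries but is described by only r*n = 200 parameters, which is why recovery is information-theoretically plausible even when r > n.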
We describe a polynomial-time algorithm based on the sum-of-squares method that can approximately decompose random n-dimensional third-order tensors of rank up to . The previous best algorithm, due to Ge and Ma, also used sum-of-squares but required quasi-polynomial time to achieve such decompositions.
Finally, we demonstrate how to reap the benefits of the sum-of-squares method without solving large semidefinite programs: using the sum-of-squares toolkit, we devise new kinds of spectral algorithms that achieve decomposition guarantees similar to those of sum-of-squares, but with running times that are close to linear in the size of the input (exponent 1.125 using fast matrix multiplication).
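To make the notion of a spectral decomposition algorithm concrete, here is a sketch of the classical Jennrich-style simultaneous-diagonalization procedure; this is a textbook baseline, not the talk's algorithm, and it only works in the undercomplete regime (rank at most the dimension), which is exactly what the overcomplete setting above goes beyond. All parameters below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 8, 5  # undercomplete: r <= n, Jennrich's regime

# Ground-truth components and the tensor T = sum_i a_i (x) a_i (x) a_i
A = rng.standard_normal((n, r))
T = np.einsum('ir,jr,kr->ijk', A, A, A)

# Contract T along two random directions x and y to get matrix slices:
# Mx = sum_i <a_i, x> a_i a_i^T, and similarly for My
x, y = rng.standard_normal(n), rng.standard_normal(n)
Mx = np.einsum('ijk,k->ij', T, x)
My = np.einsum('ijk,k->ij', T, y)

# The columns of A are eigenvectors of Mx My^+ with eigenvalues
# <a_i, x> / <a_i, y>; the remaining eigenvalues are (numerically) zero.
vals, vecs = np.linalg.eig(Mx @ np.linalg.pinv(My))
order = np.argsort(-np.abs(vals))
recovered = np.real(vecs[:, order[:r]])  # components, up to sign and scale
```

The recovered columns match the true components up to sign and scaling; for random components this succeeds with high probability, but the method breaks once r > n, motivating the sum-of-squares approach described above.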
Based on joint works with Sam Hopkins, Tengyu Ma, Tselil Schramm, and Jonathan Shi.