Polynomial-time tensor decompositions with sum-of-squares

with Tengyu Ma and Jonathan Shi. FOCS 2016.

PDF

abstract

We give new algorithms based on the sum-of-squares method for tensor decomposition. Our results improve the best known running times from quasi-polynomial to polynomial for several problems, including decomposing random overcomplete 3-tensors and learning overcomplete dictionaries with constant relative sparsity. We also give the first robust analysis for decomposing overcomplete 4-tensors in the smoothed analysis model.
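To make the problem concrete, here is a minimal sketch (an illustration assumed for this page, not the paper's algorithm) of the random overcomplete 3-tensor setting: n random unit components in dimension d with n > d, and an input tensor given by the sum of their third tensor powers. The dimensions d, n are arbitrary illustrative choices.

```python
# Sketch of the random overcomplete 3-tensor decomposition problem (setup only).
import numpy as np

rng = np.random.default_rng(0)

d, n = 20, 60          # ambient dimension d, number of components n > d (overcomplete)
A = rng.normal(size=(n, d))
A /= np.linalg.norm(A, axis=1, keepdims=True)   # random unit components a_1, ..., a_n

# Input tensor T = sum_i a_i (x) a_i (x) a_i, stored as a dense d x d x d array.
T = np.einsum('ia,ib,ic->abc', A, A, A)

# Task: given only T (possibly perturbed), approximately recover the components a_i.
# In the overcomplete regime n > d, matrix flattenings of T have rank at most d < n,
# so they cannot separate the components directly; this is one reason higher-degree
# (sum-of-squares) methods are useful here.
flattening = T.reshape(d, d * d)
print("rank of d x d^2 flattening:", np.linalg.matrix_rank(flattening), "<= d =", d)
```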

A key ingredient of our analysis is to establish small spectral gaps in moment matrices derived from solutions to sum-of-squares relaxations. To enable this analysis we augment sum-of-squares relaxations with spectral analogs of maximum entropy constraints.
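As a rough, simplified illustration (assumed for this page, not the paper's exact constraint) of what a spectral analog of a maximum-entropy constraint looks like: the degree-4 moment matrix of the intended solution, the uniform distribution over the true components, has a spread-out spectrum, i.e. its spectral norm is small relative to its trace. The dimensions d, n below are arbitrary illustrative choices.

```python
# Moment matrix of the uniform distribution over random unit components a_1, ..., a_n.
import numpy as np

rng = np.random.default_rng(1)
d, n = 20, 60
A = rng.normal(size=(n, d))
A /= np.linalg.norm(A, axis=1, keepdims=True)

# Degree-4 moment matrix M = E_i [ (a_i (x) a_i)(a_i (x) a_i)^T ], a d^2 x d^2 PSD matrix.
X = np.einsum('ia,ib->iab', A, A).reshape(n, d * d)   # row i is a_i (x) a_i
M = X.T @ X / n

eigs = np.linalg.eigvalsh(M)
print("trace(M)      =", round(M.trace(), 3))   # = 1, since each a_i is a unit vector
print("spectral norm =", round(eigs[-1], 3))    # much smaller than 1 for random components

# Imposing an upper bound of this flavor on the spectral norm of moment matrices
# forces sum-of-squares solutions to behave like such "high-entropy" distributions
# over the components, rather than concentrating on a few directions.
```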

keywords

  • sum-of-squares method
  • tensor computations
  • machine learning
  • semidefinite programming