Exact tensor completion with sum-of-squares

with Aaron Potechin. COLT 2017.

PDF

abstract

We obtain the first polynomial-time algorithm for exact tensor completion that improves over the bound implied by reduction to matrix completion. The algorithm recovers an unknown 3-tensor with $r$ incoherent, orthogonal components in $\mathbb{R}^n$ from $r \cdot \tilde O(n^{1.5})$ randomly observed entries of the tensor. This bound improves over the previous best one of $r \cdot \tilde O(n^{2})$ by reduction to exact matrix completion. Our bound also matches the best known results for the easier problem of approximate tensor completion (Barak & Moitra, 2015).
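As a rough illustration of the observation model only (not of the paper's algorithm), the sketch below builds a 3-tensor with $r$ orthonormal components and reveals about $r \cdot n^{1.5}$ entries chosen uniformly at random; the variable names, sampling scheme, and parameter values are ours and purely illustrative.

    import numpy as np

    def planted_tensor(n, r, seed=0):
        # Orthonormal components a_1, ..., a_r in R^n (rows of A).
        rng = np.random.default_rng(seed)
        Q, _ = np.linalg.qr(rng.standard_normal((n, r)))
        A = Q.T
        # T = sum_i a_i (x) a_i (x) a_i, an orthogonal rank-r 3-tensor.
        return np.einsum('ia,ib,ic->abc', A, A, A)

    def observe(T, m, seed=1):
        # Reveal m entries chosen uniformly at random (with replacement,
        # for simplicity); return the index triples and observed values.
        rng = np.random.default_rng(seed)
        n = T.shape[0]
        idx = rng.integers(0, n, size=(m, 3))
        vals = T[idx[:, 0], idx[:, 1], idx[:, 2]]
        return idx, vals

    n, r = 50, 3
    T = planted_tensor(n, r)
    m = int(r * n ** 1.5)   # sample size at the scale of the recovery bound
    idx, vals = observe(T, m)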

Our algorithm and analysis extend seminal results for exact matrix completion (Candès & Recht, 2009) to the tensor setting via the sum-of-squares method. The main technical challenge is to show that a small number of randomly chosen monomials suffice to construct a degree-3 polynomial with precisely planted orthogonal global optima over the sphere, and that this fact can be certified within the sum-of-squares proof system.
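To make the planted-optima phenomenon concrete in the fully observed case (the notation below is ours, not taken from the paper, and the paper's challenge is to recover this with only a small random subset of monomials and certify it in sum-of-squares): for orthonormal $a_1, \dots, a_r \in \mathbb{R}^n$, consider
$$ f(x) \;=\; \sum_{i=1}^{r} \langle a_i, x \rangle^3 . $$
Writing $x = \sum_i c_i a_i + x^{\perp}$ with $\|x\| = 1$ gives
$$ f(x) \;=\; \sum_i c_i^3 \;\le\; \sum_i c_i^2 \;\le\; 1 , $$
with equality if and only if $x = a_i$ for some $i$, so the global maxima of $f$ over the unit sphere are exactly the components $a_1, \dots, a_r$.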

keywords

  • sum-of-squares method
  • tensor computations
  • eigenvalues
  • machine learning
  • semidefinite programming