Rounding sum-of-squares relaxations
with Boaz Barak, Jonathan Kelner. STOC 2014.
abstract
We present a general approach to rounding semidefinite programming relaxations obtained by the Sum-of-Squares method (Lasserre hierarchy). Our approach is based on using the connection between these relaxations and the Sum-of-Squares proof system to transform a combining algorithm—an algorithm that maps a distribution over solutions into a (possibly weaker) single solution—into a rounding algorithm that maps a solution of the relaxation to a solution of the original problem.
Using this approach, we obtain algorithms that yield improved results for natural variants of three well-known problems:
We give a quasipolynomial-time algorithm that approximates $\max_{\|x\|_2 = 1} P(x)$ within an additive factor of $\epsilon \|P\|_{\mathrm{spectral}}$, where $\epsilon > 0$ is a constant, $P$ is a degree $d = O(1)$, $n$-variate polynomial with nonnegative coefficients, and $\|P\|_{\mathrm{spectral}}$ is the spectral norm of a matrix corresponding to $P$'s coefficients. Beyond being of interest in its own right, obtaining such an approximation for general polynomials (with possibly negative coefficients) is a long-standing open question in quantum information theory, and our techniques have already led to improved results in this area (Brandão and Harrow, STOC '13).
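As a concrete illustration of the quantities in this statement (not of the algorithm itself), the following sketch takes a homogeneous degree-4 polynomial, uses one standard flattening of its coefficient tensor into an $n^2 \times n^2$ matrix $M$ with $P(x) = (x \otimes x)^\top M (x \otimes x)$, and computes the spectral norm of $M$; all sizes and names are illustrative choices, not fixed by the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Coefficient tensor of a homogeneous degree-4 polynomial with
# nonnegative coefficients: P(x) = sum_{ijkl} T[i,j,k,l] x_i x_j x_k x_l.
T = rng.random((n, n, n, n))

def P(x):
    return np.einsum('ijkl,i,j,k,l->', T, x, x, x, x)

# Flatten T into an n^2 x n^2 matrix M so that P(x) = (x kron x)^T M (x kron x);
# the spectral norm (largest singular value) of M plays the role of
# ||P||_spectral for this flattening.
M = T.reshape(n * n, n * n)
spectral_norm = np.linalg.norm(M, 2)

# On the unit sphere ||x kron x||_2 = 1, hence P(x) <= ||M||: an additive
# error of eps * ||M|| is the natural scale for the approximation guarantee.
x = rng.standard_normal(n)
x /= np.linalg.norm(x)
assert abs(P(x) - np.kron(x, x) @ M @ np.kron(x, x)) < 1e-9
print(f"P(x) = {P(x):.3f}, ||P||_spectral = {spectral_norm:.3f}")
```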
We give a polynomial-time algorithm that, given a subspace $V \subseteq \mathbb{R}^n$ of dimension $d$ that (almost) contains the characteristic function of a set of size $n/k$, finds a vector $v \in V$ that satisfies $\mathbb{E}_i[v_i^4] \geq \Omega\bigl(d^{-1/3} k (\mathbb{E}_i[v_i^2])^2\bigr)$. This is a natural analytical relaxation of the problem of finding the sparsest element in a subspace, and is also motivated by a connection to the Small Set Expansion problem shown by Barak et al. (STOC 2012). In particular, our results yield an improvement of the previous best known algorithms for small set expansion in a certain range of parameters.
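To see why the ratio $\mathbb{E}_i[v_i^4] / (\mathbb{E}_i[v_i^2])^2$ measures sparsity: the characteristic function of a set of size $n/k$ achieves ratio exactly $k$, while a dense (e.g. Gaussian) vector achieves $O(1)$, so the guarantee above recovers a $d^{-1/3}$ fraction of the planted ratio. A minimal numerical check, with illustrative sizes:

```python
import numpy as np

def l4_l2_ratio(v):
    """E_i[v_i^4] / (E_i[v_i^2])^2 -- large iff v is analytically sparse."""
    return np.mean(v**4) / np.mean(v**2)**2

rng = np.random.default_rng(0)
n, k = 10_000, 100

# Characteristic function of a set of size n/k: the ratio is exactly k.
chi = np.zeros(n)
chi[: n // k] = 1.0
print(l4_l2_ratio(chi))   # == k == 100

# A dense Gaussian vector: the ratio is ~3 (the Gaussian fourth moment).
g = rng.standard_normal(n)
print(l4_l2_ratio(g))     # approximately 3
```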
We use this notion of $L_4$ vs. $L_2$ sparsity to obtain a polynomial-time algorithm with substantially improved guarantees for recovering a planted sparse vector $v$ in a random $d$-dimensional subspace of $\mathbb{R}^n$. If $v$ has $\mu n$ nonzero coordinates, we can recover it with high probability whenever $\mu \leq O(\min(1, n/d^2))$. In particular, when $d \leq \sqrt{n}$, this recovers a planted vector with up to $\Omega(n)$ nonzero coordinates. When $d \leq n^{2/3}$, our algorithm improves upon existing methods based on comparing the $L_1$ and $L_\infty$ norms, which intrinsically require $\mu \leq O(1/\sqrt{d})$.
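The sketch below sets up a planted instance with parameters in this regime and checks that the planted vector stands out under the $L_4/L_2$ ratio; actually locating it inside the subspace is what requires the sum-of-squares machinery, so this is only a sanity check of the model, with all names and parameter values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, mu = 4000, 20, 0.05   # here n/d**2 = 10, so mu <= O(min(1, n/d**2))

# Planted mu*n-sparse unit vector with random signs on its support.
v = np.zeros(n)
support = rng.choice(n, size=int(mu * n), replace=False)
v[support] = rng.choice([-1.0, 1.0], size=len(support))
v /= np.linalg.norm(v)

# Random d-dimensional subspace of R^n containing v:
# the span of v and d-1 independent Gaussian vectors.
basis = np.column_stack([v] + [rng.standard_normal(n) for _ in range(d - 1)])

def l4_l2_ratio(w):
    return np.mean(w**4) / np.mean(w**2)**2

# The planted vector is far sparser in the L4/L2 sense than a typical
# vector of the subspace, which behaves roughly like a Gaussian.
w = basis @ rng.standard_normal(d)
print(l4_l2_ratio(v), l4_l2_ratio(w))  # ~1/mu = 20 vs. ~3
```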
keywords
- sum-of-squares method
- semidefinite programming
- machine learning
- small-set expansion