# High-dimensional estimation from sum-of-squares proofs

with Prasad Raghavendra and Tselil Schramm. **ICM 2018.**

## abstract

Estimation is the computational task of recovering a *hidden parameter* $x$ associated with a distribution $\mathcal D_x$, given a *measurement* $y$ sampled from the distribution. High-dimensional estimation problems arise naturally in statistics, machine learning, and complexity theory.
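A minimal concrete instance of this setup (our illustration, not taken from the abstract) is estimating the mean of a Gaussian: the hidden parameter is a vector $x \in \mathbb{R}^n$, and the measurement is a sample from a standard Gaussian centered at $x$.

```latex
% Illustrative instance of the estimation template (assumed, for concreteness):
% hidden parameter x, measurement distribution D_x, estimator \hat{x}.
\mathcal D_x = \mathcal N(x, \mathrm{Id}_n), \qquad
y \sim \mathcal D_x, \qquad
\text{goal: compute } \hat x(y) \text{ with } \|\hat x(y) - x\| \text{ small.}
```

Richer examples in this survey's setting replace the Gaussian with structured distributions (planted solutions in random instances), where the estimator must exploit that structure.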

Many high-dimensional estimation problems can be formulated as systems of polynomial equalities and inequalities, and thus give rise to natural probability distributions over polynomial systems. Sum-of-squares proofs provide a powerful framework for reasoning about polynomial systems, and moreover there exist efficient algorithms to search for low-degree sum-of-squares proofs.
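To make the notion of a sum-of-squares proof concrete, here is a minimal sketch (our example, not from the abstract) of the certificate underlying the method in the simplest case: a univariate polynomial $p$ is nonnegative if it can be written as $m(x)^\top Q\, m(x)$ for a monomial vector $m$ and a positive semidefinite Gram matrix $Q$, since then $p(x) = \|L\, m(x)\|^2$ for a factor $Q = L^\top L$. Efficient algorithms search for such a $Q$ via semidefinite programming; below we only *verify* a hand-picked $Q$ with NumPy.

```python
import numpy as np

# Certificate that p(x) = x^4 + 2x^2 + 1 is nonnegative.
# (Illustrative sketch; in practice the Gram matrix Q is found by an SDP solver.)
# With monomial vector m(x) = [1, x, x^2], any symmetric PSD matrix Q satisfying
# m(x)^T Q m(x) = p(x) identically is a degree-4 sum-of-squares proof of p >= 0.
Q = np.array([
    [1.0, 0.0, 1.0],   # rows/columns indexed by the monomials [1, x, x^2]
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 1.0],
])

def p(x):
    return x**4 + 2 * x**2 + 1

# Check the Gram identity m(x)^T Q m(x) == p(x) at sample points.
for x in (-2.0, -0.5, 0.0, 1.0, 3.0):
    m = np.array([1.0, x, x**2])
    assert abs(m @ Q @ m - p(x)) < 1e-9

# Q is positive semidefinite, so p is a sum of squares (here p = (x^2 + 1)^2).
assert np.linalg.eigvalsh(Q).min() >= -1e-9
print("degree-4 SOS certificate verified")
```

The same Gram-matrix viewpoint scales to multivariate systems: a degree-$d$ sum-of-squares proof corresponds to a PSD matrix indexed by monomials of degree at most $d/2$, which is exactly the object the semidefinite programs in this survey optimize over.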

Understanding and characterizing the power of sum-of-squares proofs for estimation problems has been a subject of intense study in recent years. On one hand, there is a growing body of work utilizing sum-of-squares proofs for recovering solutions to polynomial systems when the system is feasible. On the other hand, a general technique referred to as *pseudocalibration* has been developed for showing lower bounds on the degree of sum-of-squares proofs. Finally, the existence of sum-of-squares refutations of a polynomial system has been shown to be intimately connected to the existence of spectral algorithms. In this article we survey these developments.

## keywords

- sum-of-squares method
- estimation
- machine learning
- semidefinite programming