Sam Hopkins: A meta-algorithm for Bayesian estimation

Time: Tue 2017-05-02 10.15

Location: Room 4523, Lindstedtsvägen 5, KTH CSC

Participating: Sam Hopkins, Cornell University

Abstract: We propose a simple and efficient meta-algorithm for Bayesian estimation problems (i.e. hidden variable, latent variable, or planted problems). Our algorithm uses low-degree polynomials together with new and highly robust tensor decomposition methods. We focus on the question: for a given estimation problem, precisely how many samples (up to low-order additive terms) do polynomial-time algorithms require to obtain good estimates of hidden variables? Our meta-algorithm is broadly applicable, and achieves statistical or conjectured computational sample-complexity thresholds for many well-studied problems, including many for which previous algorithms were highly problem-specific.
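To make the "low-degree polynomial" viewpoint concrete, here is a small toy sketch (an illustration of the general idea, not the meta-algorithm from the talk). It assumes a spiked Wigner planted model with an illustrative signal strength lam, and compares a degree-1 polynomial estimator of the hidden pairwise signs with a degree-2 "path" polynomial; all names and parameters are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy planted model (spiked Wigner), for illustration only:
#   Y = (lam / n) * x x^T + W / sqrt(n),  x uniform in {+-1}^n, W symmetric Gaussian.
n, lam = 400, 2.0
x = rng.choice([-1.0, 1.0], size=n)
W = rng.normal(size=(n, n))
W = (W + W.T) / np.sqrt(2)
Y = (lam / n) * np.outer(x, x) + W / np.sqrt(n)

# Degree-1 polynomial estimator of x_i * x_j: the single observation Y_ij.
P1 = Y

# Degree-2 "path" polynomial: sum_k Y_ik Y_kj aggregates length-2 paths,
# and above the signal threshold it correlates better with x_i * x_j.
P2 = Y @ Y

def corr(est, truth):
    """Normalized correlation between an estimator matrix and x x^T (off-diagonal entries)."""
    mask = ~np.eye(n, dtype=bool)
    a, b = est[mask], truth[mask]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

truth = np.outer(x, x)
print("degree-1 correlation:", corr(P1, truth))
print("degree-2 correlation:", corr(P2, truth))
```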

We discuss our meta-algorithm in two concrete contexts. In the stochastic block model -- a widely studied family of random graphs used to model community structure in networks -- we recover and unify the best-known algorithms for the so-called "partial recovery" problem, and obtain (conjecturally) tight sample complexity results for more realistic block model variants. For sparse PCA, a useful but computationally challenging primitive in high-dimensional statistics, we prove an optimality result for our meta-algorithm, showing that it is at least as powerful as a very strong family of convex programs called the Sum-of-Squares hierarchy.
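For readers unfamiliar with the stochastic block model setting, the sketch below generates a two-community instance and runs a standard centered-adjacency spectral heuristic for partial recovery. This is a common baseline, not the meta-algorithm described in the abstract, and the parameters n, a, b are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed two-community stochastic block model: n nodes, hidden labels sigma
# in {+-1}, edge probability a/n within a community and b/n across communities.
n, a, b = 1000, 30.0, 5.0
sigma = rng.choice([-1.0, 1.0], size=n)
prob = np.where(np.outer(sigma, sigma) > 0, a / n, b / n)
upper = np.triu(rng.random((n, n)) < prob, k=1)
A = (upper | upper.T).astype(float)

# Standard spectral heuristic (a baseline, not the talk's method): subtract the
# average edge density and sign the leading eigenvector of the centered matrix.
d = (a + b) / 2.0
B = A - d / n
vals, vecs = np.linalg.eigh(B)
guess = np.sign(vecs[:, -1])

# Overlap with the planted labels, up to the global sign ambiguity.
overlap = abs(guess @ sigma) / n
print("overlap with planted communities:", overlap)
```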

Based on joint works with Pravesh Kothari, Aaron Potechin, Prasad Raghavendra, Tselil Schramm, and David Steurer.