
Event

Aaron Smith (University of Ottawa)

Friday, November 29, 2024, 15:30 to 16:30
Burnside Hall, 805 rue Sherbrooke Ouest, Montreal, QC H3A 0B9, CA

TITLE / TITRE

Free lunches and subsampling Monte Carlo

ABSTRACT / RÉSUMÉ

It is well known that the performance of MCMC algorithms degrades quickly when targeting computationally expensive posterior distributions, including the posteriors of even simple models when the dataset is large. This has motivated the search for MCMC variants that scale well to large datasets. One simple approach, taken by several research groups, has been to look at only a subsample of the data at every step. This method is known to work quite well for optimization, and variants of stochastic gradient descent are the workhorse of modern machine learning. In this talk, we focus on a simple "no-free-lunch" result which shows that no algorithm of this sort can provide substantial speedups for Bayesian computation. We briefly sketch the main steps in the proof, illustrate how these generic results apply to realistic statistical problems and proposed algorithms, and discuss some special examples that can avoid our generic results and provide a free (or at least cheap) lunch. We also mention recent work "in both directions," extending our basic conclusion to some non-reversible chains and showing explicitly how it can be avoided for more complex posteriors. (Based on joint work with Patrick Conrad, Andrew Davis, James Johndrow, Zonghao Li, Youssef Marzouk, Natesh Pillai, Pengfei Wang and Azeem Zaman.)
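To make the subsampling idea concrete, here is a minimal, hypothetical sketch of the naive approach the abstract describes: a Metropolis-Hastings chain where each acceptance decision uses a fresh minibatch, with the minibatch log-likelihood rescaled by N/m to estimate the full-data log-likelihood. The toy Gaussian-mean model, the prior scale, and all parameter values are illustrative assumptions, not the speaker's construction; the noise this injects into the acceptance ratio is precisely the obstacle that no-free-lunch results of this kind address.

```python
import math
import random

def subsampled_log_post(theta, data, m, rng):
    """Noisy estimate of the log-posterior for a toy N(theta, 1) mean model
    with a N(0, 10^2) prior, built from a random subsample of size m.
    (Illustrative model; not the speaker's algorithm.)"""
    n = len(data)
    batch = rng.sample(data, m)
    # Rescale the minibatch log-likelihood to estimate the full-data sum.
    loglik = (n / m) * sum(-0.5 * (x - theta) ** 2 for x in batch)
    logprior = -0.5 * theta ** 2 / 100.0
    return loglik + logprior

def naive_subsampling_mh(data, n_steps, m, step=0.05, seed=0):
    """Random-walk Metropolis where each accept/reject step sees only a
    subsample. The resulting noise in the log acceptance ratio is what
    degrades the chain in the large-data regime."""
    rng = random.Random(seed)
    theta = 0.0
    chain = [theta]
    for _ in range(n_steps):
        prop = theta + rng.gauss(0.0, step)
        log_a = (subsampled_log_post(prop, data, m, rng)
                 - subsampled_log_post(theta, data, m, rng))
        if math.log(rng.random()) < log_a:
            theta = prop
        chain.append(theta)
    return chain
```

With a large dataset the variance of the rescaled minibatch estimate dominates the true log acceptance ratio, so the chain behaves almost like an unguided random walk; this is one way to see informally why such schemes cannot deliver substantial speedups without further structure.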

PLACE / LIEU

Hybride - CRM, Salle / Room 5340, Pavillon André Aisenstadt
