7–11 Apr 2025
Lecture and Conference Centre
Europe/Warsaw timezone

Sampling, optimization, SDEs and gradient flows

9 Apr 2025, 08:30
40m
Room 1.25

Speaker

Mateusz Majka

Description

We discuss connections between the problem of approximate sampling from a given probability measure and the problem of minimizing functions defined on the space of probability measures, motivated by machine learning applications. For both these problems, one can construct continuous-time stochastic processes that converge to the target measure as time goes to infinity. Among many ways of studying convergence rates for such processes, we focus on the approach via functional inequalities. In particular, we discuss the Polyak-Łojasiewicz inequality on the space of measures and its relation to the classical log-Sobolev inequality, and we explain why it is a natural condition for obtaining exponential convergence of Fisher-Rao gradient flows. We also mention applications of such flows to solving min-max problems on the space of probability measures, motivated by the problem of training Generative Adversarial Networks. Based on joint papers with Razvan-Andrei Lascu (Heriot-Watt), Linshan Liu (Heriot-Watt) and Łukasz Szpruch (University of Edinburgh).
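The mechanism the abstract describes can be illustrated in its best-understood special case. The sketch below is not taken from the talk: it assumes the functional being minimized is the KL divergence to the target and that the dynamics are the Wasserstein-2 (Langevin / Fokker-Planck) gradient flow, for which the classical log-Sobolev inequality plays exactly the role of a Polyak-Łojasiewicz condition.

```latex
% Illustrative sketch, NOT from the talk. Assumptions: F(mu) = KL(mu || pi)
% for a target pi ~ e^{-V}, and mu_t follows the Wasserstein-2 gradient
% flow of F (the Fokker--Planck / Langevin dynamics).
%
% Dissipation along the flow (de Bruijn identity), with I the relative
% Fisher information:
\[
  \frac{d}{dt}\,\mathrm{KL}(\mu_t \,\|\, \pi) = -\, I(\mu_t \,\|\, \pi),
  \qquad
  I(\mu \,\|\, \pi) := \int \Big|\nabla \log \tfrac{d\mu}{d\pi}\Big|^2 \, d\mu .
\]
% A log-Sobolev inequality with constant \lambda > 0 is precisely a
% Polyak--Łojasiewicz condition for this functional:
\[
  \mathrm{KL}(\mu \,\|\, \pi) \;\le\; \frac{1}{2\lambda}\, I(\mu \,\|\, \pi)
  \quad \text{for all } \mu .
\]
% Combining the two gives d/dt KL <= -2*lambda*KL, so Gronwall's lemma
% yields exponential convergence to the target measure:
\[
  \mathrm{KL}(\mu_t \,\|\, \pi) \;\le\; e^{-2\lambda t}\, \mathrm{KL}(\mu_0 \,\|\, \pi).
\]
```

For the Fisher-Rao gradient flows treated in the talk, a Polyak-Łojasiewicz inequality on the space of measures plays the analogous role, with the metric, and hence the dissipation term, changed accordingly.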

Primary author

Mateusz Majka

Presentation materials

There are no materials yet.