7–11 Apr 2025
Lecture and Conference Centre
Europe/Warsaw timezone

Information Geometry of Exponentiated Gradient: Convergence beyond L-Smoothness

9 Apr 2025, 17:30
20m
Room 0.23

Speaker

Yara Elshiaty

Description

We study the minimization of smooth, possibly nonconvex functions over the positive orthant, a key setting in Poisson inverse problems, using the exponentiated gradient (EG) method. Interpreting EG as Riemannian gradient descent (RGD) with the e-Exp map from information geometry as a retraction, we prove global convergence under weak assumptions, without requiring L-smoothness, as well as finite termination of the Riemannian Armijo line search. Numerical experiments, including an accelerated variant, highlight EG's practical advantages, such as faster convergence than RGD based on interior-point geometry.
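To make the setting concrete, here is a minimal sketch of the exponentiated gradient update on a toy positive-orthant problem. This is an illustrative fixed-step variant, not the speaker's implementation: the objective, step size, and iteration count are assumptions chosen for the example, and the Armijo line search and acceleration mentioned in the abstract are omitted.

```python
import numpy as np

def exponentiated_gradient(grad_f, x0, step=0.2, iters=500):
    """Exponentiated gradient (EG): a multiplicative update that keeps
    iterates strictly inside the positive orthant. The step
    x <- x * exp(-step * grad f(x)) can be read as a Riemannian
    gradient step using the e-Exp map as a retraction."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x * np.exp(-step * grad_f(x))
    return x

# Toy objective (hypothetical example): f(x) = sum(x - b * log x),
# a simple Poisson-type model with unique minimizer x* = b > 0.
b = np.array([1.0, 2.0, 0.5])
grad = lambda x: 1.0 - b / x
x_star = exponentiated_gradient(grad, x0=np.ones(3))
```

Because the update is multiplicative, no projection onto the positive orthant is needed: starting from any strictly positive point, every iterate stays strictly positive by construction.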
