Description
Motivated by conditional gradient methods, also known as Frank-Wolfe algorithms in the smooth case, we present an adaptation of these methods to non-smooth problems. Smooth Frank-Wolfe algorithms have found many applications across optimization and data science. We analyze a non-smooth adaptation called the Abs-Smooth Frank-Wolfe method, and we prove primal-dual convergence rates for convex abs-smooth functions that mirror those of the smooth setting. We also examine factors that help accelerate these convergence rates. The approach has been implemented and released as a Julia package, AbsSmoothFrankWolfe.jl, which we tested on a range of non-smooth benchmark problems, verifying the predicted convergence rates.
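As background for the talk, the classical smooth Frank-Wolfe step can be sketched as follows. This is a minimal illustration, not the Abs-Smooth variant presented in the talk: it minimizes a hypothetical smooth objective f(x) = ||x - b||^2 over the probability simplex, where the linear minimization oracle simply returns the vertex with the smallest gradient coordinate, and uses the standard step size 2/(k+2).

```python
def frank_wolfe_simplex(b, iters=5000):
    """Classical Frank-Wolfe for f(x) = ||x - b||^2 over the probability simplex.

    Illustrative sketch only; the choice of objective and feasible set is
    an assumption made for this example.
    """
    n = len(b)
    x = [1.0 / n] * n  # start at the barycenter of the simplex
    for k in range(iters):
        grad = [2.0 * (x[i] - b[i]) for i in range(n)]
        # Linear minimization oracle over the simplex: the minimizing
        # vertex is the coordinate direction with the smallest gradient entry.
        i_min = min(range(n), key=lambda i: grad[i])
        gamma = 2.0 / (k + 2)  # standard open-loop step size
        # Convex combination of the current iterate and the FW vertex.
        x = [(1.0 - gamma) * x[i] for i in range(n)]
        x[i_min] += gamma
    return x

# Since b lies in the simplex, the iterates approach b at the O(1/k) rate.
x = frank_wolfe_simplex([0.2, 0.5, 0.3])
```

Each iteration calls only a linear oracle rather than a projection, which is what makes Frank-Wolfe methods attractive on structured feasible sets; the talk's contribution extends this template to non-smooth abs-smooth objectives.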