

Oral

Nonlinearly Preconditioned Gradient Methods under Generalized Smoothness

Konstantinos Oikonomidis · Jan Quan · Emanuel Laude · Panagiotis Patrinos

West Ballroom C
Wed 16 Jul 10 a.m. — 10:15 a.m. PDT

Abstract: We analyze nonlinearly preconditioned gradient methods for solving smooth minimization problems. We introduce a generalized smoothness property, based on the notion of abstract convexity, that is broader than Lipschitz smoothness, and we provide sufficient first- and second-order conditions. Notably, our framework encapsulates algorithms associated with the gradient clipping method and brings out novel insights for the class of $(L_0,L_1)$-smooth functions that has recently received widespread interest, thus allowing us to extend beyond already established methods. We investigate the convergence of the proposed method in both the convex and nonconvex settings.
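For readers unfamiliar with the algorithm class the abstract refers to, the classical gradient clipping update, which the authors' framework encapsulates, can be sketched as follows. This is an illustrative example, not the paper's method; the step size `eta`, clipping threshold `c`, and the quartic test function are assumptions chosen for the demo:

```python
import numpy as np

def clipped_gradient_step(x, grad_fn, eta=0.1, c=1.0):
    """One gradient clipping step, a simple instance of a nonlinearly
    preconditioned update: x+ = x - eta * min(1, c / ||g||) * g.
    eta and c are illustrative parameters, not values from the paper."""
    g = grad_fn(x)
    norm = np.linalg.norm(g)
    scale = min(1.0, c / norm) if norm > 0 else 0.0
    return x - eta * scale * g

# Demo objective: f(x) = ||x||^4, whose gradient norm grows faster than
# linearly in ||x|| -- the kind of behavior (L0, L1)-smoothness allows,
# where fixed-step gradient descent from a distant start can diverge.
grad_fn = lambda x: 4.0 * np.linalg.norm(x) ** 2 * x

x = np.array([3.0, -2.0])
for _ in range(200):
    x = clipped_gradient_step(x, grad_fn)
print(np.linalg.norm(x))  # norm shrinks toward 0
```

While the gradient is large, the clipping factor caps the effective step at `eta`, so the iterates move a bounded distance per step regardless of how steep the objective is; once the gradient norm drops below `c`, the method reduces to plain gradient descent.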
