Poster
DTZO: Distributed Trilevel Zeroth Order Learning with Provable Non-Asymptotic Convergence
Yang Jiao · Kai Yang · Chengtao Jian
East Exhibition Hall A-B #E-1507
(1) Nested optimization has attracted significant attention in machine learning, with applications in areas such as meta-learning, adversarial learning, hyperparameter optimization, and continual learning. Solving nested optimization problems without relying on gradient information has become increasingly important, especially with the rise of LLMs, since commercial LLM APIs often do not expose gradients.

(2) Tackling nested optimization problems without gradient information is highly challenging. We propose DTZO, the first framework to address trilevel (three-level) nested optimization problems in a zeroth-order manner, i.e., using only function evaluations, and we provide non-asymptotic convergence guarantees for the proposed trilevel zeroth-order algorithm.

(3) This work bridges nested optimization and zeroth-order methods, making trilevel learning applicable in gradient-free settings and closing an important theoretical gap.
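To make "zeroth-order manner" concrete: such methods replace exact gradients with estimates built purely from function evaluations. The sketch below shows a standard two-point Gaussian-smoothing gradient estimator of this kind; it is an illustration of the general zeroth-order idea, not the DTZO algorithm itself, and all names and parameter values are illustrative.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-3, num_samples=20, rng=None):
    """Two-point zeroth-order gradient estimate of f at x.

    Averages (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u over random Gaussian
    directions u; this estimates the gradient of the Gaussian-smoothed f
    using only function evaluations, no gradient access required.
    """
    rng = np.random.default_rng(rng)
    grad = np.zeros_like(x)
    for _ in range(num_samples):
        u = rng.standard_normal(x.size)
        grad += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return grad / num_samples

# Example: gradient-free descent on a simple quadratic (illustrative only).
f = lambda x: np.sum((x - 1.0) ** 2)
x = np.zeros(5)
for _ in range(200):
    x -= 0.1 * zo_gradient(f, x)
print(x)  # approaches the minimizer [1, 1, 1, 1, 1]
```

In a nested (e.g., trilevel) problem, estimators like this stand in for the gradients at each level of the hierarchy, which is what makes gradient-free nested optimization possible but also analytically delicate.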