Some Results in Personalization and Asynchrony for Distributed Learning

May 26, 2023, ESB 2001

Cesar Uribe, Electrical and Computer Engineering

Abstract

We study the personalized federated learning problem under asynchronous updates. In this problem, each client seeks a personalized model that outperforms both its local model and the global model. We consider two optimization-based frameworks for personalization: (i) Model-Agnostic Meta-Learning (MAML) and (ii) Moreau Envelopes (ME). MAML learns a joint model that is adapted to each client through fine-tuning, whereas ME poses a bi-level optimization problem with implicit gradients that enforces personalization via regularized losses. We focus on improving the scalability of personalized federated learning by removing the synchronous communication assumption. Moreover, we extend the studied function class by removing boundedness assumptions on the gradient norm. Our main technical contribution is a unified proof for asynchronous federated learning with bounded staleness that we apply to the MAML and ME personalization frameworks. For the class of smooth non-convex functions, we show convergence of our method to a first-order stationary point. Extensions to decentralized optimization and reinforcement learning will be discussed. We illustrate the performance of our method and its tolerance to staleness through experiments on classification tasks over heterogeneous datasets.
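As a sketch of the two personalization frameworks named in the abstract (notation assumed, not taken from the talk: $f_i$ is the loss of client $i$, $N$ the number of clients, $\alpha$ a fine-tuning step size, and $\lambda$ a regularization parameter), the standard formulations are:

```latex
% MAML-style personalization: optimize the joint model so that
% one local gradient step performs well on each client.
\min_{w \in \mathbb{R}^d} \; \frac{1}{N} \sum_{i=1}^{N} f_i\big(w - \alpha \nabla f_i(w)\big)

% Moreau-envelope personalization: a bi-level problem where each
% client's personalized model \theta_i stays close to the global w.
\min_{w \in \mathbb{R}^d} \; \frac{1}{N} \sum_{i=1}^{N} F_i(w),
\qquad
F_i(w) = \min_{\theta_i \in \mathbb{R}^d} \left\{ f_i(\theta_i) + \frac{\lambda}{2} \|\theta_i - w\|^2 \right\}
```

In the ME formulation, $\nabla F_i(w) = \lambda (w - \hat{\theta}_i(w))$ with $\hat{\theta}_i(w)$ the inner minimizer, which is where the implicit gradients mentioned in the abstract arise.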
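To illustrate the asynchrony side, here is a minimal, hypothetical sketch of a staleness-damped server update: a client returns a model delta computed from a copy that is `tau` rounds old, and the server scales it down accordingly. This is a generic bounded-staleness pattern, not the speaker's specific algorithm; the damping rule and names are assumptions.

```python
def staleness_weight(tau, eta=1.0):
    """Mixing weight for an update computed from a model copy tau rounds old.

    With bounded staleness (tau <= tau_max), the weight stays bounded away
    from zero, so every client's update eventually contributes.
    """
    return eta / (1.0 + tau)


def apply_update(global_model, client_delta, tau, eta=1.0):
    """Apply a (possibly stale) client delta to the global model in place of
    a synchronous average: w <- w + weight(tau) * delta."""
    w = staleness_weight(tau, eta)
    return [g + w * d for g, d in zip(global_model, client_delta)]
```

For example, a fresh update (`tau=0`) is applied at full strength, while an update one round stale is halved: `apply_update([0.0, 0.0], [1.0, 2.0], tau=1)` yields `[0.5, 1.0]`.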

Speaker's Bio

Cesar A. Uribe is the Louis Owen Assistant Professor in the Department of Electrical and Computer Engineering at Rice University. He received M.Sc. degrees in systems and control from the Delft University of Technology, The Netherlands, and in applied mathematics from the University of Illinois at Urbana-Champaign in 2013 and 2016, respectively. He received the Ph.D. degree in electrical and computer engineering from the University of Illinois at Urbana-Champaign in 2018. He was a Postdoctoral Associate at the Laboratory for Information and Decision Systems (LIDS) at the Massachusetts Institute of Technology (MIT) until 2020 and held a visiting professor position at the Moscow Institute of Physics and Technology until 2022. His research interests include distributed learning and optimization, decentralized control, algorithm analysis, and computational optimal transport.

Video URL: