Thursday April 9, 2026 12:15pm - 2:15pm GMT+07

Authors - Shamita Jagarlamudi, Soormayee Joshi, Aman Aditya, Anushka Gangwar, Pratvina Talele
Abstract - Federated Learning (FL) is a privacy-preserving, distributed learning framework in which models are trained locally on client devices and only the trained parameters are shared with a central server. Nevertheless, FL faces substantial obstacles in real-world applications due to data heterogeneity: non-IID distributions lead to local inconsistencies and client drift, diminishing global model efficacy. To tackle these challenges, we propose Federated Prox Drift Correction (FedPDC), an effective and practical method that mitigates client drift and local overfitting through the combined use of drift correction and proximal terms. Comprehensive experiments on public datasets demonstrate that FedPDC outperforms state-of-the-art methods.
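The abstract combines two known ideas: a proximal term that penalizes how far a client's model moves from the server model (as in FedProx), and control-variate-style drift correction (as in SCAFFOLD). The paper's exact formulation is not given here, so the following is only a minimal illustrative sketch of how those two ingredients typically combine in a client update; all names, hyperparameters, and the aggregation rule are assumptions, not the authors' method.

```python
import numpy as np

def local_update(w_global, grad_fn, c_global, c_local, lr=0.1, mu=0.01, steps=10):
    """Sketch of one client's local round with a FedProx-style proximal
    term and a SCAFFOLD-style drift correction (illustrative only)."""
    w = w_global.copy()
    for _ in range(steps):
        g = grad_fn(w)
        # drift correction: replace the local gradient bias with the global estimate
        g_corr = g - c_local + c_global
        # proximal term mu * (w - w_global) keeps the client near the server model
        w = w - lr * (g_corr + mu * (w - w_global))
    # refresh the local control variate from the realized model movement
    c_local_new = c_local - c_global + (w_global - w) / (lr * steps)
    return w, c_local_new

def server_aggregate(client_weights):
    """FedAvg-style aggregation: simple mean of the returned client models."""
    return np.mean(client_weights, axis=0)
```

With `mu = 0` and zero control variates this reduces to plain local SGD; the proximal weight `mu` and the control variates are what limit client drift under non-IID data.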
Paper Presenter
Virtual Room F Bangkok, Thailand
