Authors - Shamita Jagarlamudi, Soormayee Joshi, Aman Aditya, Anushka Gangwar, Pratvina Talele
Abstract - Federated Learning (FL) is a privacy-preserving, distributed learning framework in which models are trained locally on client devices and only the trained parameters are shared with a central server. Nevertheless, FL faces substantial obstacles in real-world applications due to data heterogeneity: non-IID distributions lead to local inconsistencies and client drift, thereby diminishing global model efficacy. To tackle these challenges, we propose Federated Prox Drift Correction (FedPDC), an effective and practical method that mitigates client drift and local overfitting through drift correction and proximal terms. Comprehensive experiments on public datasets demonstrate that FedPDC outperforms state-of-the-art methods.
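
The abstract pairs a FedProx-style proximal term with a drift-correction term in the client update. Since the exact FedPDC update rule is not given here, the following is only a minimal sketch of how such a combination is commonly structured: a proximal gradient pulling the local model toward the global weights, plus a SCAFFOLD-style control-variate offset. All names (`mu`, `client_c`, `server_c`, `local_update`) and the specific combination are illustrative assumptions, not the authors' algorithm.

```python
# Hedged sketch of a proximal + drift-corrected client update.
# NOT the paper's FedPDC algorithm; an assumed FedProx/SCAFFOLD-style
# combination for illustration only.
import numpy as np

def local_update(w_global, grad_fn, client_c, server_c,
                 mu=0.01, lr=0.1, steps=5):
    """One client's local training round.

    grad_fn(w)          : local mini-batch gradient at w
    mu * (w - w_global) : proximal term pulling w toward the global model
    server_c - client_c : drift-correction offset (control variates)
    """
    w = w_global.copy()
    for _ in range(steps):
        g = grad_fn(w)
        w -= lr * (g + mu * (w - w_global) + (server_c - client_c))
    return w

# Toy usage: local loss 0.5 * ||w - t||^2, so the gradient is w - t.
t = np.array([1.0, -2.0])
w_new = local_update(np.zeros(2), lambda w: w - t,
                     client_c=np.zeros(2), server_c=np.zeros(2))
```

In this sketch the proximal term damps local overfitting by penalizing distance from the global model, while the control-variate offset counteracts systematic client drift; the toy run moves the client weights partway from the global initialization toward the local optimum `t`.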