Quality, Reliability, Security and Robustness in Heterogeneous Systems. 19th EAI International Conference, QShine 2023, Shenzhen, China, October 8 – 9, 2023, Proceedings, Part I

Research Article

Fast Convergence Federated Learning with Adaptive Gradient: An Application to Mental Healthcare Monitoring System

Cite
@INPROCEEDINGS{10.1007/978-3-031-65126-7_24,
    author={Junqiao Fan and Xuehe Wang and Yuzhu Hu},
    title={Fast Convergence Federated Learning with Adaptive Gradient: An Application to Mental Healthcare Monitoring System},
    proceedings={Quality, Reliability, Security and Robustness in Heterogeneous Systems. 19th EAI International Conference, QShine 2023, Shenzhen, China, October 8 -- 9, 2023, Proceedings, Part I},
    proceedings_a={QSHINE},
    year={2024},
    month={8},
    keywords={Adaptive Gradient, Federated Learning, Non-IID Datasets, Depression Detection},
    doi={10.1007/978-3-031-65126-7_24}
}
Junqiao Fan1, Xuehe Wang1,*, Yuzhu Hu2
  • 1: The School of Artificial Intelligence, Sun Yat-Sen University
  • 2: The School of Intelligent Systems Engineering, Sun Yat-Sen University
*Contact email: wangxuehe@mail.sysu.edu.cn

Abstract

Nowadays, there is increasing demand for mental health monitoring systems that enable the diagnosis of diseases such as anxiety and depression. However, privacy concerns over sensitive data impede their wide adoption. To protect data privacy, federated learning (FL) has been proposed to enable decentralized collaborative model learning without sharing sensitive data. However, the FL training process can be slowed by non-Independent-and-Identically-Distributed (non-IID) datasets across participating clients, incurring extra communication costs. In this paper, we propose an FL adaptive gradient optimization method to accelerate convergence under non-IID training. As the reference direction for parameter updates, the gradient has a great impact on convergence performance throughout training. By adaptively modifying the local gradients according to the global gradient, we reduce local parameter divergence to enable robust training and fast convergence. Meanwhile, as an application of our FL optimization algorithm, a novel sleep monitoring system is proposed to detect potential depression. Experiments demonstrate that, compared to the commonly adopted Federated Averaging (FedAVG) and other adaptive optimization methods, our proposed method achieves faster convergence and higher accuracy, effectively saving communication costs.
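The core idea the abstract describes, blending each client's local gradient toward the shared global gradient to curb client drift under non-IID data, can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's exact update rule: the function names (`fedavg_aggregate`, `adaptive_local_gradient`) and the simple convex-combination correction with mixing weight `beta` are illustrative choices, shown alongside a plain FedAVG weighted average for contrast.

```python
import numpy as np

def fedavg_aggregate(client_params, client_sizes):
    """FedAVG baseline: average client parameter vectors weighted by
    local dataset size."""
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()
    return sum(w * p for w, p in zip(weights, client_params))

def adaptive_local_gradient(local_grad, global_grad, beta=0.5):
    """Illustrative gradient correction (not the paper's exact rule):
    pull the local gradient toward the global gradient direction.
    Under non-IID data a client's gradient can point away from the
    global one; blending reduces that divergence before the next
    aggregation round."""
    return (1.0 - beta) * local_grad + beta * global_grad

# A client whose local gradient disagrees with the global direction:
g_local = np.array([1.0, -1.0])
g_global = np.array([1.0, 1.0])
g_corrected = adaptive_local_gradient(g_local, g_global, beta=0.5)
print(g_corrected)  # the disagreeing component is damped toward zero
```

With `beta=0` the update falls back to the purely local gradient (standard FedAVG local training); with `beta=1` every client follows the global gradient exactly, so `beta` trades local fit against cross-client consistency.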

Keywords
Adaptive Gradient, Federated Learning, Non-IID Datasets, Depression Detection
Published
2024-08-20
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-65126-7_24
Copyright © 2023–2025 ICST
