
Research Article
CancersQA: Federated Learning with Pre-trained Models for Intelligent Medical Diagnosis
@INPROCEEDINGS{10.1007/978-3-031-71716-1_14, author={Kunyu Yang and Luyao Peng and Zheng Liu and Chaomurilige Wang and Yihang Dai}, title={CancersQA: Federated Learning with Pre-trained Models for Intelligent Medical Diagnosis}, proceedings={Machine Learning and Intelligent Communication. 8th EAI International Conference, MLICOM 2023, Beijing, China, December 17, 2023, Proceedings}, proceedings_a={MLICOM}, year={2024}, month={9}, keywords={Federated learning, Pre-trained model, Medical question answering, Deep learning, Natural language processing}, doi={10.1007/978-3-031-71716-1_14} }
Kunyu Yang
Luyao Peng
Zheng Liu
Chaomurilige Wang
Yihang Dai
Year: 2024
CancersQA: Federated Learning with Pre-trained Models for Intelligent Medical Diagnosis
MLICOM
Springer
DOI: 10.1007/978-3-031-71716-1_14
Abstract
Pre-trained models (PTMs) have demonstrated extraordinary efficacy in natural language processing (NLP), particularly in addressing critical challenges in question-answering (QA) systems. However, the medical QA field grapples with issues such as the low robustness of generation models and a shortage of available medical data, a problem further complicated by stringent privacy requirements. Leveraging the synergy between Federated Learning (FL) and PTMs, we propose CancersQA, an innovative method for intelligent cancer diagnosis, together with an effective aggregation algorithm, WeiPro. The method employs the GPT-2 model within an FL framework and fine-tunes it on sparse and unevenly distributed medical consultation data. It is designed to perform well on small-sample datasets while markedly reducing training time. The WeiPro aggregation strategy consolidates the parameters of the FL clients by taking several factors into account. Finally, we conducted rigorous experiments to evaluate CancersQA's performance. Our results indicate that the approach significantly surpasses other models on medical QA tasks, underscoring its potential for adapting large language models to the critical domain of healthcare.
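The abstract does not spell out how WeiPro combines client parameters, so the following is only a minimal illustrative sketch of weighted federated aggregation in the spirit of FedAvg, where per-client weights stand in for the "various factors" the paper mentions. The function name `weighted_aggregate`, the use of local sample counts as weights, and the toy data are all assumptions for illustration, not the authors' actual WeiPro algorithm.

```python
import numpy as np

def weighted_aggregate(client_states, client_weights):
    """Combine client parameter dicts into one global model.

    client_states : list of dicts mapping parameter name -> np.ndarray
    client_weights: list of non-negative scores per client (here, assumed
                    to reflect factors such as local dataset size).
    """
    weights = np.asarray(client_weights, dtype=np.float64)
    weights = weights / weights.sum()  # normalise to a convex combination

    global_state = {}
    for name in client_states[0]:
        # Weighted sum of each parameter tensor across all clients.
        global_state[name] = sum(w * s[name] for w, s in zip(weights, client_states))
    return global_state

# Toy usage: three hypothetical clients with uneven amounts of consultation data.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    states = [{"layer.weight": rng.normal(size=(2, 2))} for _ in range(3)]
    sizes = [120, 45, 300]  # hypothetical local sample counts per client
    global_model = weighted_aggregate(states, sizes)
    print(global_model["layer.weight"])
```

In a full FL round, each client would fine-tune its local copy of the pre-trained model (GPT-2 in this paper) on its private consultation data, send back only the updated parameters, and the server would aggregate them as sketched above before broadcasting the new global model.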