Collaborative Computing: Networking, Applications and Worksharing. 19th EAI International Conference, CollaborateCom 2023, Corfu Island, Greece, October 4-6, 2023, Proceedings, Part III

Research Article

A Novel Deep Federated Learning-Based and Profit-Driven Service Caching Method

Cite (BibTeX)
    @INPROCEEDINGS{10.1007/978-3-031-54531-3_7,
        author={Zhaobin Ouyang and Yunni Xia and Qinglan Peng and Yin Li and Peng Chen and Xu Wang},
        title={A Novel Deep Federated Learning-Based and Profit-Driven Service Caching Method},
        proceedings={Collaborative Computing: Networking, Applications and Worksharing. 19th EAI International Conference, CollaborateCom 2023, Corfu Island, Greece, October 4-6, 2023, Proceedings, Part III},
        proceedings_a={COLLABORATECOM PART 3},
        year={2024},
        month={2},
        keywords={service caching, profit maximization, popularity prediction, caching decisions, collaborative mechanism},
        doi={10.1007/978-3-031-54531-3_7}
    }
    
Zhaobin Ouyang1, Yunni Xia1,*, Qinglan Peng2, Yin Li, Peng Chen3, Xu Wang4
  • 1: College of Computer Science
  • 2: School of Artificial Intelligence
  • 3: School of Computer and Software Engineering
  • 4: College of Mechanical and Vehicle Engineering
*Contact email: xiayunni@hotmail.com

Abstract

Service caching is an emerging solution for handling massive volumes of service requests in distributed environments, supporting rapidly growing services and applications. With the explosive growth of global mobile data traffic, service caching over the mobile edge computing (MEC) architecture has emerged as a way to alleviate traffic congestion and improve the efficiency of task processing. In this manuscript, we propose FPDRD, a novel profit-driven service caching method for edge environments that combines a federated learning model for service popularity prediction with a deep reinforcement learning model for yielding caching decisions. The proposed method is aware of temporal service popularity and user preferences. It aims to guarantee the quality of service (QoS) of cached service delivery while maximizing the profit of network service providers. Experimental results clearly demonstrate that the FPDRD method outperforms traditional methods in multiple aspects.
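The paper's concrete formulation is not reproduced on this page, so the following is only a minimal illustrative sketch of the general idea the abstract describes, not the FPDRD algorithm itself: each edge node fits a simple local popularity predictor, a server averages the local weights (plain federated averaging), and a greedy profit-per-size rule, standing in for the paper's deep reinforcement learning component, selects which services to cache. All names (local_fit, federated_average, greedy_cache), the linear model, and the synthetic data are assumptions introduced here for illustration.

    # Illustrative sketch only -- NOT the FPDRD algorithm from the paper.
    # Assumed setup: linear popularity predictors at edge nodes, federated
    # averaging of their weights, and a greedy profit-driven caching rule.
    import numpy as np

    rng = np.random.default_rng(0)
    N_SERVICES, N_NODES, N_FEATURES = 20, 4, 3   # hypothetical sizes

    def local_fit(X, y):
        """Least-squares fit of one edge node's local popularity model."""
        w, *_ = np.linalg.lstsq(X, y, rcond=None)
        return w

    def federated_average(weights, sizes):
        """Average local model weights, weighted by local data size."""
        sizes = np.asarray(sizes, dtype=float)
        return np.average(np.stack(weights), axis=0, weights=sizes / sizes.sum())

    def greedy_cache(pred_popularity, profit_per_hit, service_size, capacity):
        """Greedily cache services with the best predicted profit per unit size."""
        score = pred_popularity * profit_per_hit / service_size
        cached, used = [], 0.0
        for s in np.argsort(score)[::-1]:
            if used + service_size[s] <= capacity:
                cached.append(int(s))
                used += service_size[s]
        return cached

    # Synthetic per-node request data (features might encode recency, time of day, user mix).
    X_nodes = [rng.random((N_SERVICES, N_FEATURES)) for _ in range(N_NODES)]
    true_w = np.array([2.0, 0.5, 1.0])
    y_nodes = [X @ true_w + rng.normal(0, 0.1, N_SERVICES) for X in X_nodes]

    # 1) One federated round: fit locally, average globally (no raw data leaves a node).
    global_w = federated_average([local_fit(X, y) for X, y in zip(X_nodes, y_nodes)],
                                 sizes=[len(y) for y in y_nodes])

    # 2) Profit-driven caching decision at one node using the shared predictor.
    popularity = X_nodes[0] @ global_w
    profit = rng.uniform(1.0, 5.0, N_SERVICES)     # assumed profit per cache hit
    size = rng.uniform(1.0, 10.0, N_SERVICES)      # assumed storage cost per service
    print("cached services:", greedy_cache(popularity, profit, size, capacity=25.0))

In the paper's setting a learned policy would replace the greedy rule and the predictor would be a deep model; the sketch only shows how popularity prediction and profit-aware selection can be composed.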

Keywords
service caching, profit maximization, popularity prediction, caching decisions, collaborative mechanism
Published
2024-02-23
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-54531-3_7