
Research Article

Revolutionizing Cloud Resource Allocation: Harnessing Layer-Optimized Long Short-Term Memory for Energy-Efficient Predictive Resource Management

Cite (BibTeX)
@ARTICLE{10.4108/ew.6505,
    author={Prathigadapa Sireesha and Vishnu Priyan S and M Govindarajan and Sounder Rajan and V Rajakumareswaran},
    title={Revolutionizing Cloud Resource Allocation: Harnessing Layer-Optimized Long Short-Term Memory for Energy-Efficient Predictive Resource Management},
    journal={EAI Endorsed Transactions on Energy Web},
    volume={11},
    number={1},
    publisher={EAI},
    journal_a={EW},
    year={2024},
    month={7},
    keywords={Long short-term memory, cloud computing, energy-efficient resources, resource forecasting},
    doi={10.4108/ew.6505}
}
    
Prathigadapa Sireesha1,*, Vishnu Priyan S2, M Govindarajan3, Sounder Rajan4, V Rajakumareswaran4
  • 1: Asia Pacific University of Technology & Innovation
  • 2: Kings Engineering College
  • 3: Velalar College of Engineering and Technology
  • 4: Erode Sengunthar Engineering College
*Contact email: sireesha.prathi@apu.edu.my

Abstract

INTRODUCTION: Accurate data center resource projection is challenging because the workloads of multi-tenant, co-hosted applications are dynamic and constantly changing. Resource Management in the Cloud (RMC) has therefore become a significant research area. Under the cloud's flexible service options, users can choose to pay a fixed sum or pay based on the amount of time resources are used.

OBJECTIVES: The main goal of this study is a systematic method for estimating future cloud resource requirements from historical consumption. Distributing resources to users, who require a variety of resource types, is one of the principal objectives of cloud computing addressed in this study.

METHODS: This article proposes a Layer-Optimized Long Short-Term Memory (LOLSTM) model to estimate the resource requirements for upcoming time slots. The model also detects SLA violations when the QoS value exceeds a dynamic threshold and then proposes appropriate countermeasures based on the risk associated with the violation.

RESULTS: The training and validation accuracies are 97.6% and 95.9%, respectively; RMSE and MAD show error rates of 0.127 and 0.107; and the training and validation losses at epoch 100 are minimal, at 0.6092 and 0.5828, respectively. The suggested technique therefore performed better than the existing techniques.

CONCLUSION: In this work, the resource requirements for future time slots are predicted using the LOLSTM technique, which regularizes the weights of the network and avoids overfitting. In addition, the proposed work takes the necessary actions if an SLA violation is recognized by the model. Overall, the proposed approach shows better performance than the existing methods.
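As a rough illustration of the approach described in METHODS, the sketch below shows one way a layer-optimized LSTM forecaster with per-layer weight regularization, together with a dynamic-threshold SLA check, could look in Keras. This is not the authors' implementation: the layer sizes, L2 penalties, dropout rates, window length, and threshold rule are all illustrative assumptions.

# Minimal sketch (assumed hyperparameters, not the paper's code): an LSTM
# forecaster for next-slot resource usage plus a dynamic-threshold SLA check.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models, regularizers

WINDOW = 12  # number of past time slots used to predict the next one (assumed)

def build_lolstm(n_features: int) -> tf.keras.Model:
    """Stacked LSTM layers, each with its own L2 weight penalty and dropout,
    predicting the resource usage vector for the next time slot."""
    model = models.Sequential([
        layers.Input(shape=(WINDOW, n_features)),
        layers.LSTM(64, return_sequences=True,
                    kernel_regularizer=regularizers.l2(1e-4)),
        layers.Dropout(0.2),
        layers.LSTM(32, kernel_regularizer=regularizers.l2(1e-4)),
        layers.Dropout(0.2),
        layers.Dense(n_features),  # predicted usage for the upcoming slot
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

def sla_violation(qos_series: np.ndarray, window: int = 24, k: float = 2.0) -> bool:
    """Flag a violation when the latest QoS value exceeds a dynamic threshold,
    here taken as rolling mean + k * rolling std over the recent window
    (an assumed rule standing in for the paper's threshold)."""
    recent = qos_series[-window:]
    threshold = recent.mean() + k * recent.std()
    return qos_series[-1] > threshold

In use, such a model would be trained on sliding windows of historical per-slot utilization (for example, CPU and memory) with the next slot's utilization as the target, and the SLA check would run on the observed QoS series at each new time slot.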

Keywords
Long short-term memory, cloud computing, energy-efficient resources, resource forecasting
Received
2023-12-26
Accepted
2024-06-27
Published
2024-07-03
Publisher
EAI
http://dx.doi.org/10.4108/ew.6505

Copyright © 2024 P. Sireesha et al., licensed to EAI. This is an open access article distributed under the terms of the CC BY-NC-SA 4.0 license, which permits copying, redistributing, remixing, transforming, and building upon the material in any medium so long as the original work is properly cited.
