A hybrid cloud load balancing and host utilization prediction method using deep learning and optimization techniques.
Simaiya, Sarita; Lilhore, Umesh Kumar; Sharma, Yogesh Kumar; Rao, K B V Brahma; Maheswara Rao, V V R; Baliyan, Anupam; Bijalwan, Anchit; Alroobaea, Roobaea.
Affiliations
  • Simaiya S; Department of Computer Science and Engineering, Chandigarh University, Gharuan, Mohali, Punjab, 140413, India. saritasimaiya@gmail.com.
  • Lilhore UK; Department of Computer Science and Engineering, Chandigarh University, Gharuan, Mohali, Punjab, 140413, India.
  • Sharma YK; Department of Computer Science and Engineering, Koneru Lakshmaiah Education Foundation, Greenfield, Vaddeswaram, Guntur, AP, India.
  • Rao KBVB; Department of Computer Science and Engineering, Koneru Lakshmaiah Education Foundation, Vaddeswaram, Andhra Pradesh, India.
  • Maheswara Rao VVR; Department of Computer Science and Engineering, Shri Vishnu Engineering College for Women (A), Bhimavaram, India.
  • Baliyan A; Department of Computer Science and Engineering, Chandigarh University, Gharuan, Mohali, Punjab, 140413, India.
  • Bijalwan A; Arba Minch University, Arba Minch, Ethiopia. anchit.bijalwan@amu.edu.et.
  • Alroobaea R; Department of Computer Science, College of Computers and Information Technology, Taif University, P. O. Box 11099, 21944, Taif, Saudi Arabia.
Sci Rep ; 14(1): 1337, 2024 Jan 16.
Article in English | MEDLINE | ID: mdl-38228707
ABSTRACT
Virtual machine (VM) integration methods have proven effective for optimizing load balancing in cloud data centers. Their main challenge is the trade-off among cost-effectiveness, quality of service, performance, optimal resource utilization, and avoidance of service level agreement violations. Deep learning methods are widely used in existing research on cloud load balancing; however, capturing noisy, multilayered workload fluctuations remains difficult under limited resource-level provisioning. The long short-term memory (LSTM) model plays a vital role in predicting server load and provisioning workloads. This research presents a hybrid model combining deep learning with Particle Swarm Optimization and a Genetic Algorithm ("DPSO-GA") for dynamic workload provisioning in cloud computing. The proposed model works in two phases. The first phase uses the hybrid PSO-GA approach to fine-tune the hyperparameters, combining the strengths of both methods to address the prediction challenge. In the second phase, a CNN-LSTM network, trained with the hybrid PSO-GA approach, forecasts resource consumption. In the proposed framework, a one-dimensional CNN and an LSTM forecast cloud resource utilization at subsequent time steps: the CNN module extracts complex distinguishing features from VM workload statistics, while the LSTM module models temporal information to predict the upcoming VM workload. The proposed model integrates utilization across multiple resources simultaneously, which helps overcome load-balancing and over-provisioning issues. Comprehensive simulations on the Google cluster traces benchmark dataset verify the efficiency of the proposed DPSO-GA technique in improving resource distribution and load balancing in the cloud. The proposed model achieves outstanding results in terms of precision, accuracy, and load allocation.
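For illustration only, the sketch below shows the kind of one-dimensional CNN + LSTM forecaster the abstract describes, whose layer sizes and learning rate would be the hyperparameters a PSO-GA search tunes in the first phase. This is not the authors' code: the look-back window, the two jointly forecast resources (CPU and memory), and all default values are assumptions made for the example.

```python
# Minimal sketch of a CNN-LSTM resource-utilization forecaster (assumed design,
# not the paper's implementation). The Conv1D layer extracts local features from
# recent workload windows; the LSTM models their temporal dependence; the Dense
# head predicts next-step utilization for each resource.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW = 30       # assumed look-back window (time steps)
N_RESOURCES = 2   # assumed resources forecast jointly, e.g. CPU and memory

def build_cnn_lstm(filters=32, kernel_size=3, lstm_units=64, learning_rate=1e-3):
    """Hyperparameters here (filters, kernel_size, lstm_units, learning_rate)
    are the kind of values a hybrid PSO-GA search would tune."""
    model = models.Sequential([
        layers.Input(shape=(WINDOW, N_RESOURCES)),
        layers.Conv1D(filters, kernel_size, padding="causal", activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.LSTM(lstm_units),
        layers.Dense(N_RESOURCES),  # next-step utilization per resource
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate), loss="mse")
    return model

# Toy usage on random data standing in for windows of Google cluster trace utilization.
X = np.random.rand(256, WINDOW, N_RESOURCES).astype("float32")
y = np.random.rand(256, N_RESOURCES).astype("float32")
model = build_cnn_lstm()
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:1]))
```

In an actual DPSO-GA-style setup, each candidate in the swarm/population would encode one such hyperparameter vector, and its fitness would be the validation forecasting error of the resulting model.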

Full text: 1 Collection: 01-international Database: MEDLINE Study type: Prognostic_studies / Risk_factors_studies Language: En Journal: Sci Rep Year: 2024 Document type: Article Country of affiliation: India Country of publication: United Kingdom
