Abstract
With the rapid development of cloud computing technology, more and more users choose cloud services, and the mismatch between load requests and resource supply has become increasingly prominent. As a result, user requests cannot be responded to in time, which greatly degrades cloud service quality; real-time prediction of load requests helps supply resources promptly. To address the low performance of load prediction methods in cloud computing environments, a cloud computing load prediction method based on a hybrid model of complete ensemble empirical mode decomposition with adaptive noise and convolutional long short-term memory (CEEMDAN-ConvLSTM) is proposed. First, the CEEMDAN technique decomposes the data sequence into several sub-sequences that are easy to analyze and model. Then the convolutional long short-term memory (ConvLSTM) prediction model is used to model and predict each sub-sequence; a multi-process parallel computing scheme enables parallel prediction of the sub-sequences and Bayesian optimization of hyperparameters. Finally, the sub-sequence predictions are superimposed to obtain the output of the whole model, achieving high-precision prediction of the original complex sequence data. The hybrid model is validated on the Google cluster workload dataset. Experimental results show that the CEEMDAN-ConvLSTM hybrid model achieves a good prediction effect: compared with the autoregressive integrated moving average model (ARIMA), the long short-term memory network (LSTM), and ConvLSTM, it reduces the root mean square error (RMSE) by 30.9%, 30.1%, and 22.5%, respectively.
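The decompose–predict–recombine pipeline described above can be sketched as follows. This is a minimal illustration of the structure only: the moving-average split stands in for CEEMDAN's intrinsic mode functions, and the naive last-value forecaster stands in for the ConvLSTM model — both are placeholder assumptions, not the paper's actual methods. What carries over is the key property that the decomposition components sum back to the original series, so per-component forecasts can simply be superimposed.

```python
import numpy as np

def toy_decompose(x, window=5):
    """Stand-in for CEEMDAN: split x into a smooth component and a residual.
    As with CEEMDAN's sub-sequences, the components sum back to the
    original signal exactly, so predictions can later be superimposed."""
    kernel = np.ones(window) / window
    trend = np.convolve(x, kernel, mode="same")  # smooth component
    residual = x - trend                          # what the trend misses
    return [trend, residual]

def toy_predict_next(sub):
    """Stand-in for the ConvLSTM sub-sequence model: a naive one-step
    forecast that just repeats the last observed value."""
    return sub[-1]

def forecast(x):
    """Decompose, predict each component independently, then superimpose."""
    comps = toy_decompose(x)
    assert np.allclose(sum(comps), x)  # additivity: components reconstruct x
    return sum(toy_predict_next(c) for c in comps)

# Synthetic workload-like series: periodic signal plus noise.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.1 * rng.normal(size=200)
print(forecast(series))
```

In the actual model, each sub-sequence would be forecast by its own ConvLSTM (the paper runs these in parallel across processes, with Bayesian optimization tuning each model's hyperparameters), but the recombination step — summing the per-component predictions — is exactly as shown.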
Authors
赵鹏
周建涛
赵大明
ZHAO Peng; ZHOU Jiantao; ZHAO Daming (College of Computer Science, Inner Mongolia University, Hohhot 010021, China; National & Local Joint Engineering Research Center of Intelligent Information Processing Technology for Mongolian, Hohhot 010021, China; Engineering Research Center of Ecological Big Data, Ministry of Education, Hohhot 010021, China; Inner Mongolia Engineering Laboratory for Cloud Computing and Service Software, Hohhot 010021, China; Inner Mongolia Key Laboratory of Social Computing and Data Processing, Hohhot 010021, China; Inner Mongolia Engineering Laboratory for Big Data Analysis Technology, Hohhot 010021, China; Inner Mongolia Key Laboratory of Discipline Inspection and Supervision Big Data, Hohhot 010021, China)
Source
《计算机科学》
CSCD
Peking University Core Journal (北大核心)
2023, Issue S01, pp. 642-650 (9 pages)
Computer Science
Funding
National Natural Science Foundation of China (62162046)
Inner Mongolia Science and Technology Research Project (2021GG0155)
Major Project of the Inner Mongolia Natural Science Foundation (2019ZD15)
Inner Mongolia Natural Science Foundation (2019GG372)