

Long sequence time-series forecasting method based on multi-scale segmentation
Abstract  Addressing the challenges posed by large volumes of historical data, high computational complexity, and stringent accuracy requirements in long sequence time-series forecasting (LSTF), we propose a Transformer model based on multi-scale segmentation. The model improves on the Transformer architecture by using multi-scale segmentation to slice the time series into multiple segments for training and prediction, thereby reducing the complexity of long time series and achieving higher-accuracy forecasts. On the electricity transformer temperature (ETT), electricity consumption load (ECL), and Weather datasets, the proposed model is compared against five baseline models: the standard Transformer, Informer, gated recurrent unit (GRU), temporal convolutional network (TCN), and long short-term memory (LSTM). Results show that on the Weather dataset, with a prediction length of 192, the multi-scale segmentation Transformer achieves a mean squared error (MSE) of 0.367 and a mean absolute error (MAE) of 0.407, outperforming all other models. By combining the strengths of the Transformer architecture with multi-scale segmentation, the proposed model achieves faster computation and superior predictive performance.
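The core idea the abstract describes, slicing a long series into segments at several temporal scales before feeding them to the model, can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, the non-overlapping slicing, and the choice of scales are illustrative assumptions.

```python
import numpy as np

def multi_scale_segments(series, scales=(4, 8, 16)):
    """Slice a 1-D series into non-overlapping segments at several scales.

    Illustrative sketch only, not the paper's method. Returns a dict mapping
    each segment length to an array of shape (num_segments, segment_length);
    trailing points that do not fill a whole segment are dropped.
    """
    series = np.asarray(series)
    out = {}
    for s in scales:
        n = len(series) // s            # number of full segments at this scale
        out[s] = series[:n * s].reshape(n, s)
    return out

# Example: a toy series of 20 points sliced at scales 4 and 8.
ts = np.arange(20.0)
segs = multi_scale_segments(ts, scales=(4, 8))
print(segs[4].shape)  # (5, 4)
print(segs[8].shape)  # (2, 8)
```

Each per-scale batch of segments could then be embedded and attended over separately, so attention cost grows with the number of segments rather than the raw sequence length.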
Authors  HE Shenglin; LONG Chen; ZHENG Jing; WANG Shuang; WEN Zhenkun; WU Huisi; NI Dong; HE Xiaorong; WU Xueqing (College of Computer Science and Software Engineering, Shenzhen University, Shenzhen 518060, Guangdong Province, P.R. China; Shenzhen Health Development Research and Data Management Center, Shenzhen 518028, Guangdong Province, P.R. China; Shenzhen Likang Technology Co., Ltd., Shenzhen 518052, Guangdong Province, P.R. China; Department of Gynaecology and Obstetrics, Shenzhen University General Hospital, Shenzhen 518071, Guangdong Province, P.R. China)
Source  Journal of Shenzhen University (Science and Engineering), 2024, No. 2, pp. 232-240 (indexed in CAS, CSCD, and the Peking University Core Journals list)
Funding  National Natural Science Foundation of China (61572328)
Keywords  computer neural networks; time series forecasting; Transformer model; multi-scale segmentation; deep learning; power forecasting