
Long-term PM_(2.5) Concentration Forecasting Based on Self-attention
Abstract: In recent years, machine learning-based PM_(2.5) prediction has gradually become mainstream, offering strong nonlinear modeling capability and improved prediction accuracy. However, predicting PM_(2.5) concentration changes over long time spans remains challenging. This study constructed several models based on self-attention mechanisms, extending daily PM_(2.5) concentration prediction to a 14-day horizon and improving day-level prediction accuracy. A comparative analysis of the Informer, Autoformer, FEDformer, and TCN models was conducted for long-term daily prediction, enhancing the accuracy and reliability of PM_(2.5) prediction models. Three forecast horizons were evaluated: 3, 7, and 14 days, and the Autoformer model performed best at every horizon. Compared with the TCN model, Autoformer reduced RMSE by 43.36% and MAE by 42.70% at the 3-day horizon; at the 7-day horizon, RMSE improved by 39.07% and MAE by 8.98%; at the 14-day horizon, RMSE improved by 39.07% and MAE by 8.98%. The study effectively improved the accuracy of PM_(2.5) prediction in long-sequence time-series forecasting.
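As a minimal sketch of how the reported comparisons can be read, the following Python snippet computes RMSE and MAE over a forecast horizon and the percentage reduction of a model's error relative to a baseline, which is how figures such as "RMSE optimized by 43.36%" are typically derived. The function names, array shapes, and sample values below are illustrative assumptions, not the paper's code or data.

```python
# Minimal sketch (not the paper's implementation): computing RMSE/MAE
# for a daily PM2.5 forecast and the relative improvement of one model
# over a baseline. All values here are illustrative, not from the paper.
import numpy as np

def rmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Root mean squared error over a forecast horizon."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean absolute error over a forecast horizon."""
    return float(np.mean(np.abs(y_true - y_pred)))

def relative_improvement(baseline_err: float, model_err: float) -> float:
    """Percentage reduction of an error metric relative to the baseline."""
    return 100.0 * (baseline_err - model_err) / baseline_err

# Hypothetical daily PM2.5 observations (ug/m^3) over a 3-day horizon,
# with predictions from a baseline (TCN-like) and a comparison model.
y_true = np.array([35.0, 42.0, 38.0])
baseline_pred = np.array([50.0, 30.0, 45.0])
model_pred = np.array([38.0, 40.0, 36.0])

print(f"baseline RMSE={rmse(y_true, baseline_pred):.2f}  MAE={mae(y_true, baseline_pred):.2f}")
print(f"model    RMSE={rmse(y_true, model_pred):.2f}  MAE={mae(y_true, model_pred):.2f}")
print(f"RMSE reduction: "
      f"{relative_improvement(rmse(y_true, baseline_pred), rmse(y_true, model_pred)):.2f}%")
```

Under this reading, an "optimization" of X% means the model's error metric is X% lower than the baseline's on the same test horizon.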
Author: 何宇涵 HE Yuhan (School of Electronic Information, Wuhan University, Wuhan 430000, China)
Source: Changjiang Information & Communications (《长江信息通信》), 2024, No. 10, pp. 72-75 (4 pages)
Keywords: PM_(2.5); long sequence time-series forecasting (LSTF); self-attention; Autoformer; TCN