
Prediction of time series traffic based on improved LSTM algorithm (Citations: 10)

Abstract: The prediction of time series traffic has been a hot issue in machine learning in recent years. Prediction accuracy can be greatly improved by changing the network structure (the number of network layers, the number of neurons per layer, the connections between layers, and the use of special layers) and by selecting an appropriate optimizer and loss function. Here we propose a multi-layer LSTM (Long Short-Term Memory) algorithm, a single model improved upon the traditional LSTM algorithm, whose low design complexity improves the efficiency of machine learning. The model consists of one input layer, five hidden layers, and one output layer, and also includes a fully connected layer and a Dropout layer, the latter to prevent over-fitting. Adam is used as the optimizer, mlse as the loss function, and ReLU as the activation function. Experimental results show that the proposed model has better generalization ability than the traditional LSTM model. (An illustrative code sketch of this architecture follows the keywords below.)
Authors: GUO Jiali, XING Shuangyun, LUAN Hao, JIA Yanting (School of Sciences, Shenyang Jianzhu University, Shenyang 110168; Teaching Affairs Department, Shenyang Jianzhu University, Shenyang 110168)
Source: Journal of Nanjing University of Information Science & Technology (Natural Science Edition), 2021, No. 5, pp. 571-575 (5 pages). Indexed by CAS and the Peking University Core Journal list.
Funding: National Natural Science Foundation of China (61803275); Liaoning Revitalization Talents Program (XLYC1907044); Natural Science Foundation of Liaoning Province (2020-MS-218); Key Project of the Education Department of Liaoning Province (lnzd202007).
Keywords: time series prediction; long short-term memory (LSTM); prediction accuracy; generalization ability
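
The configuration described in the abstract (one input layer, five LSTM hidden layers, a fully connected layer, a Dropout layer, one output layer, Adam optimizer, ReLU activation) can be illustrated with a minimal sketch. This is not the authors' code: the layer widths, window length, dropout rate, and training settings below are assumptions, and the abstract's "mlse" loss is interpreted here as Keras's mean squared logarithmic error ("msle").

```python
# Illustrative sketch only (not the authors' code): a stacked LSTM for
# one-step-ahead time series traffic prediction, assuming Keras/TensorFlow.
# Layer widths, window length, dropout rate, and training settings are
# assumptions; the abstract's "mlse" loss is read here as Keras's "msle".
import numpy as np
from tensorflow.keras.layers import Input, LSTM, Dense, Dropout
from tensorflow.keras.models import Sequential

TIMESTEPS, FEATURES = 10, 1  # assumed sliding-window length and feature count

model = Sequential([
    Input(shape=(TIMESTEPS, FEATURES)),   # input layer
    # Five stacked LSTM hidden layers; all but the last return sequences so
    # the following LSTM layer receives 3-D input.
    LSTM(64, activation="relu", return_sequences=True),
    LSTM(64, activation="relu", return_sequences=True),
    LSTM(64, activation="relu", return_sequences=True),
    LSTM(64, activation="relu", return_sequences=True),
    LSTM(64, activation="relu"),
    Dropout(0.2),                   # Dropout layer to curb over-fitting
    Dense(32, activation="relu"),   # fully connected layer
    Dense(1),                       # output layer: next traffic value
])

model.compile(optimizer="adam", loss="msle")

# Toy usage with random data standing in for windowed traffic samples.
X = np.random.rand(128, TIMESTEPS, FEATURES).astype("float32")
y = np.random.rand(128, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
```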