
Research on LSTM runoff forecast model coupled with Encoder-Decoder

Cited by: 9
Abstract: In this study, the long short-term memory neural network (LSTM) is coupled with the Encoder-Decoder structure to form the LSTM-ED model, which is compared with a standalone LSTM artificial-intelligence runoff forecast model. Application results in the Jianxi basin of the Minjiang River show that, compared with LSTM, LSTM-ED has higher accuracy and stability both over the whole testing period and at each forecast horizon, and produces smaller peak-flow errors for typical floods. Its unique semantic vector maintains the continuity of hydrological information, so the forecast runoff process of LSTM-ED is insensitive to fluctuations in the rainfall process. The forecast ability of both models is closely related to the maximum concentration time (MCT) of the basin: when the forecast horizon is shorter than the MCT, both models forecast well; when the forecast horizon exceeds the MCT, forecast ability deteriorates significantly; and when the forecast horizon is much longer than the MCT, both models lose forecast reliability.
Authors: LIN Kangling; CHEN Hua; CHEN Qingyong; LUO Yuxuan; LIU Feng; CHEN Jie (State Key Laboratory of Water Resources and Hydropower Engineering Science, Wuhan University, Wuhan 430072, China; Fujian Hydrographic and Water Resources Survey Centre, Fuzhou 350001, China; School of Computer Science, Wuhan University, Wuhan 430072, China)
Source: Engineering Journal of Wuhan University (《武汉大学学报(工学版)》), 2022, No. 8, pp. 755-761 (7 pages). Indexed in: CAS, CSCD, PKU Core.
Funding: National Key R&D Program of China (No. 2019YFC1510703).
Keywords: runoff forecast; Encoder-Decoder structure; long short-term memory neural network; deep learning; artificial neural network
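The abstract describes an encoder that compresses the observed rainfall-runoff sequence into a fixed-length "semantic vector", from which a decoder unrolls one runoff prediction per lead time. Below is a minimal NumPy sketch of that general encoder-decoder LSTM structure, untrained and with random weights; all function names, dimensions, and the toy data are hypothetical illustrations, not the authors' actual implementation or the model trained on the Jianxi basin data.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    # One LSTM step; gate pre-activations stacked as
    # [input, forget, output, candidate] along the first axis.
    n = h.size
    z = W @ x + U @ h + b
    i = sigmoid(z[:n])
    f = sigmoid(z[n:2 * n])
    o = sigmoid(z[2 * n:3 * n])
    g = np.tanh(z[3 * n:])
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def init_lstm(n_in, n_hid):
    # Small random weights; a real model would learn these by training.
    return (rng.normal(0.0, 0.1, (4 * n_hid, n_in)),
            rng.normal(0.0, 0.1, (4 * n_hid, n_hid)),
            np.zeros(4 * n_hid))

def forecast(obs_seq, enc, dec, w_out, horizon, n_hid):
    # Encoder: fold the observation sequence into a fixed-length
    # state (h, c) -- the "semantic vector" the abstract refers to.
    h = np.zeros(n_hid)
    c = np.zeros(n_hid)
    for x in obs_seq:
        h, c = lstm_step(x, h, c, *enc)
    # Decoder: unroll from that state, feeding each prediction back
    # as the next input, one runoff value per lead time.
    y = np.zeros(1)
    outs = []
    for _ in range(horizon):
        h, c = lstm_step(y, h, c, *dec)
        y = np.array([w_out @ h])
        outs.append(y[0])
    return np.array(outs)

n_hid = 16
enc = init_lstm(2, n_hid)        # encoder inputs: rainfall + antecedent runoff
dec = init_lstm(1, n_hid)        # decoder input: previous predicted runoff
w_out = rng.normal(0.0, 0.1, n_hid)

obs_seq = rng.random((24, 2))    # 24 past time steps of toy observations
pred = forecast(obs_seq, enc, dec, w_out, horizon=6, n_hid=n_hid)
print(pred.shape)                # one forecast value per lead time: (6,)
```

Because the decoder conditions only on the encoder's final state rather than on each raw rainfall input, a noisy observation perturbs the forecast only through that compressed state, which is one way to read the abstract's claim that LSTM-ED is less disturbed by rainfall fluctuations.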