Predicting citywide traffic flow using attention-based spatial-temporal neural network
Abstract  Reliable traffic flow prediction is of great significance for traffic management and public safety. It is also a challenging task, however, because traffic flow is affected by spatial dependencies, temporal dependencies, and additional external factors such as weather and emergencies. Most existing work considers only some of these attributes of traffic data, which leads to insufficient modeling and unsatisfactory prediction performance. This paper therefore proposes a novel end-to-end deep learning model, the spatio-temporal attention ConvLSTM (ST-AttConvLSTM), for traffic flow prediction. ST-AttConvLSTM splits the modeling into three branches. In each branch, a residual neural network extracts local spatial features, which are further combined with external factors such as weather; a convolutional long short-term memory network (ConvLSTM) and an attention module are then used to mine the latent patterns of the flow and to capture the correlations of the data in both the spatial and temporal dimensions. Two real-world mobility datasets from Beijing and New York City are used to evaluate the proposed method, and the experimental results show that it achieves higher prediction accuracy than well-known baseline methods.
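The abstract outlines the architecture but the record includes no code. The following is only a minimal sketch of the described three-branch design (residual convolutions for local spatial features, ConvLSTM plus attention in each branch, external factors fused before the output) written with TensorFlow/Keras. The grid size, channel counts, branch names (closeness/period/trend), the exact attention form, and the fusion scheme are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the three-branch idea described in the abstract.
# Grid size, channels, branch names, attention form and fusion scheme
# are assumptions for illustration only.
import tensorflow as tf
from tensorflow.keras import layers, Model

H, W = 32, 32      # city grid size (assumed)
C = 2              # inflow / outflow channels (assumed)
STEPS = 4          # time steps fed to each branch (assumed)
EXT_DIM = 8        # dimension of external factors, e.g. weather (assumed)

def branch(name):
    """One temporal branch: a residual conv block per frame for local spatial
    features, a ConvLSTM over the sequence, then a simple temporal attention
    that weights and sums the per-step feature maps."""
    seq_in = layers.Input(shape=(STEPS, H, W, C), name=f"{name}_seq")
    x = layers.TimeDistributed(
        layers.Conv2D(32, 3, padding="same", activation="relu"))(seq_in)
    res = layers.TimeDistributed(layers.Conv2D(32, 3, padding="same"))(x)
    x = layers.Add()([x, res])                           # residual connection
    h = layers.ConvLSTM2D(32, 3, padding="same", return_sequences=True)(x)
    scores = layers.TimeDistributed(layers.GlobalAveragePooling2D())(h)
    scores = layers.Dense(1)(scores)                     # (batch, STEPS, 1)
    alpha = layers.Softmax(axis=1)(scores)               # attention weights
    alpha = layers.Reshape((STEPS, 1, 1, 1))(alpha)
    ctx = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([h, alpha])
    return seq_in, ctx                                   # ctx: (batch, H, W, 32)

in_c, out_c = branch("closeness")
in_p, out_p = branch("period")
in_t, out_t = branch("trend")

# External factors (weather, holidays, ...) broadcast onto the grid.
ext_in = layers.Input(shape=(EXT_DIM,), name="external")
ext = layers.Dense(H * W, activation="relu")(ext_in)
ext = layers.Reshape((H, W, 1))(ext)

fused = layers.Concatenate()([out_c, out_p, out_t, ext])
pred = layers.Conv2D(C, 3, padding="same", activation="tanh")(fused)

model = Model([in_c, in_p, in_t, ext_in], pred)
model.compile(optimizer="adam", loss="mse")
```

In practice each branch would be fed a different temporal slice of the flow tensor (for example recent, daily, and weekly intervals), and the tanh output would be rescaled back to real flow counts; those details are not specified in this record.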
Authors  Liao Huiruo; Yang Yan (School of Computing & Artificial Intelligence, Southwest Jiaotong University, Chengdu 611756, China; Key Laboratory of Sichuan Province for Cloud Computing & Intelligent Technology, Chengdu 611756, China)
Source  Application Research of Computers (《计算机应用研究》, CSCD, Peking University core journal), 2021, No. 10, pp. 2935-2940 (6 pages)
Fund  National Natural Science Foundation of China (61976247)
Keywords  traffic flow prediction; deep learning; ConvLSTM; attention model