Abstract
Short-term prediction of station freight volume helps stations and dispatching departments understand the trend of volume changes in advance, adjust the allocation of transportation resources, and improve transportation organization efficiency. This study takes the railway freight stations of the National Energy Group as its research object and constructs a freight volume prediction model based on a heterogeneous spatial-temporal graph attention network. In the graph network, stations are treated as nodes, while the physical adjacency relationships, waybill demand relationships, and train operation relationships between stations are abstracted as heterogeneous edges between nodes. Within each single subgraph, the model uses a graph attention mechanism to capture the spatial correlations between a station and its neighbors; a heterogeneous node feature fusion mechanism then integrates information across the three subgraphs, and the resulting spatial features are fed into a gated recurrent unit (GRU) network to update the time-series features. Experiments on actual freight volume data from the railway stations of the National Energy Group show that the proposed model is more accurate in prediction and can effectively assist dispatching and statistical work.
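The pipeline described above (graph attention within each subgraph, weighted fusion of the three heterogeneous subgraphs, then a GRU over time) can be sketched as follows. This is a minimal illustrative sketch with toy dimensions, random weights, and hypothetical function names, not the authors' implementation; the fusion is simplified to a learned scalar weight per subgraph.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gat_layer(H, A, W, a):
    """One graph-attention layer: H (N,F) node features, A (N,N) adjacency
    (nonzero where an edge exists), W (F,F) projection, a (2F,) attention vector."""
    Z = H @ W                                    # project node features
    N = Z.shape[0]
    # attention logits e_ij = LeakyReLU(a^T [z_i || z_j])
    logits = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            v = a @ np.concatenate([Z[i], Z[j]])
            logits[i, j] = np.maximum(0.2 * v, v)  # LeakyReLU, slope 0.2
    logits = np.where(A > 0, logits, -1e9)       # mask non-neighbors
    alpha = softmax(logits, axis=1)              # attention coefficients per row
    return np.tanh(alpha @ Z)                    # aggregate attended neighbors

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """Standard GRU cell update for one time step."""
    sig = lambda v: 1 / (1 + np.exp(-v))
    z = sig(x @ Wz + h @ Uz)                     # update gate
    r = sig(x @ Wr + h @ Ur)                     # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)     # candidate state
    return (1 - z) * h + z * h_tilde

rng = np.random.default_rng(0)
N, F, T = 5, 4, 3                                # stations, features, time steps
# three heterogeneous edge sets: physical adjacency, waybill demand, train operation
adjs = [np.eye(N) + (rng.random((N, N)) > 0.6) for _ in range(3)]
params = [(rng.normal(size=(F, F)) * 0.1, rng.normal(size=2 * F) * 0.1) for _ in adjs]
fuse_w = softmax(rng.normal(size=3))             # toy per-subgraph fusion weights
gru_p = [rng.normal(size=(F, F)) * 0.1 for _ in range(6)]

h = np.zeros((N, F))                             # hidden state: one row per station
for t in range(T):
    X = rng.normal(size=(N, F))                  # station features at step t
    # spatial: GAT per subgraph, then weighted fusion of the three outputs
    spatial = sum(w * gat_layer(X, A, W, a)
                  for w, A, (W, a) in zip(fuse_w, adjs, params))
    h = gru_step(spatial, h, *gru_p)             # temporal update
print(h.shape)  # final hidden state, (N, F): one feature vector per station
```

A prediction head (e.g. a linear layer mapping each station's hidden state to next-period freight volume) would sit on top of `h`; it is omitted here for brevity.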
Authors
ZHANG Haishan; WANG Wenbin; ZHOU Jin (Department of Coal Transportation, China Shenhua Energy Limited, Beijing 100040, China; Institute of Communication and Information Technology, CRSC Research & Design Institute Group Co., Ltd., Beijing 100070, China)
Source
Railway Freight Transport (《铁道货运》), 2024, Issue 6, pp. 52-59 (8 pages)
Keywords
Heavy-haul Railway
Station Freight Volume
Spatial-Temporal Graph Attention Network
Time-series Prediction
Attention Mechanism