
A Civil Aviation Unplanned Event Analysis Method Combining Word2vec and BiLSTM
Abstract: Safety is the core theme of the civil aviation industry. To address the heavy reliance on expert experience and the low efficiency of current civil aviation unplanned-event analysis, this article proposes an analysis method that combines Word2vec with a bidirectional long short-term memory (BiLSTM) neural network. First, a Word2vec model is trained on the event-text corpus to produce word vectors, reducing the dimensionality of the vector space. Then, a BiLSTM model automatically extracts features, capturing the complete sequence information and contextual feature vectors of the event text. Finally, a softmax function classifies the unplanned events. Experimental results show that the proposed method achieves a better classification effect, attaining higher accuracy and F1-score, and maintains stable classification performance on imbalanced data samples, demonstrating the method's applicability and effectiveness for civil aviation unplanned-event analysis.
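The final classification stage the abstract describes can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: it assumes the Word2vec embedding and BiLSTM encoding have already been run, uses random stand-ins for the forward and backward hidden states, concatenates them into the contextual feature vector the abstract mentions, and applies a linear layer plus softmax over a hypothetical number of event classes. All dimensions and names here are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical dimensions; the paper's actual settings are not given here.
hidden_dim = 4    # size of each directional LSTM hidden state
num_classes = 3   # number of unplanned-event categories (illustrative)

rng = np.random.default_rng(0)

# Stand-ins for the final forward and backward BiLSTM hidden states.
# In practice these come from running the LSTM over Word2vec word vectors.
h_forward = rng.standard_normal(hidden_dim)
h_backward = rng.standard_normal(hidden_dim)

# BiLSTM contextual feature vector: concatenation of both directions.
context = np.concatenate([h_forward, h_backward])  # shape (2 * hidden_dim,)

# A linear classification layer followed by softmax.
W = rng.standard_normal((num_classes, 2 * hidden_dim))
b = np.zeros(num_classes)
probs = softmax(W @ context + b)

predicted_class = int(np.argmax(probs))
print(probs, predicted_class)
```

The concatenation step is what distinguishes a BiLSTM from a unidirectional LSTM classifier: the event text is encoded once left-to-right and once right-to-left, so the classifier sees context from both sides of every token.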
Authors: WANG Jie, ZHOU Di, ZUO Hongfu, HUANG Wei (College of Civil Aviation, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China; Chengdu Spaceon Electronics Co., Ltd., Chengdu 611731, China)
Source: Journal of Hefei University of Technology (Natural Science), 2024, No. 7, pp. 917-924 (8 pages). Indexed in CAS and the Peking University Core Journal list.
Funding: Joint Fund of the National Natural Science Foundation of China (U1933202); NSFC Civil Aviation Joint Fund key project (U1733201).
Keywords: civil aviation safety; text analysis; unplanned event; Word2vec; bidirectional long short-term memory (BiLSTM) neural network

