
Application of Graph Convolution Neural Network for Sentiment Analysis of Chinese Dialogue Texts (cited by: 3)
Abstract: Most current models for dialogue sentiment analysis ignore the mutual influence of the speakers' emotions. To effectively identify the sentiment categories expressed by speakers in dialogue texts, while fully accounting for the emotional interaction between interlocutors, this paper proposes a dialogue sentiment analysis method based on a graph convolutional neural network. First, a BiGRU encodes the dialogue text in its sequential context to obtain an utterance representation. Then, a directed graph is constructed according to the speakers' turn order, and a graph convolutional network produces a new representation vector for each utterance. Finally, the two utterance representations are concatenated, and a similarity-based attention mechanism yields the final utterance representation used for sentiment classification. Experimental results on the DailyDialog Chinese corpus show that, compared with CNN and BiLSTM models, the BiGRU + GCN model improves dialogue sentiment classification accuracy by about 15%, with a clear improvement in F1 score as well, achieving good sentiment classification performance.
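The pipeline described in the abstract (sequence-context encoding, graph convolution over a turn-order directed graph, concatenation, then similarity-based attention) can be sketched roughly as follows. This is a minimal toy illustration, not the authors' implementation: the random utterance features stand in for BiGRU outputs, the row-normalized propagation rule is one common GCN variant, and the pooling step is an assumption; the paper's exact formulation may differ.

```python
import numpy as np

def gcn_layer(H, A, W):
    """One graph-convolution layer, H' = ReLU(D^-1 (A + I) H W).
    Row-degree normalization with self-loops; a simple stand-in for
    the paper's propagation rule."""
    A_hat = A + np.eye(A.shape[0])             # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))   # row-degree normalization
    return np.maximum(0.0, D_inv @ A_hat @ H @ W)

def similarity_attention(H, context):
    """Similarity-based attention: softmax over dot-product similarity
    to a context vector, then a weighted sum of the utterance vectors."""
    scores = H @ context
    weights = np.exp(scores - scores.max())    # numerically stable softmax
    weights /= weights.sum()
    return weights @ H

# Toy dialogue: 4 utterances with 8-dim features standing in for BiGRU output
rng = np.random.default_rng(0)
H_seq = rng.normal(size=(4, 8))                # sequence-context representations
A = np.array([[0, 1, 0, 0],                    # directed edges follow turn order
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)
W = rng.normal(size=(8, 8))

H_graph = gcn_layer(H_seq, A, W)               # speaker-graph representations
H_cat = np.concatenate([H_seq, H_graph], axis=1)   # concatenate both views
final = similarity_attention(H_cat, H_cat.mean(axis=0))
print(final.shape)   # (16,)
```

In practice the attention step would produce one refined representation per utterance before a softmax classifier; here it is collapsed to a single pooled vector to keep the sketch short.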
Authors: YANG Qing; ZHU Li; ZHANG Ya-wen; WU Tao (Department of Computer, Central China Normal University; National Language Resources Monitoring and Research Network Media Center, Wuhan 430079, China)
Source: Software Guide (《软件导刊》), 2021, Issue 3, pp. 7-12 (6 pages)
Funding: National Natural Science Foundation of China (61532008); National Key R&D Program of China (2017YFC0909502).
Keywords: Chinese dialogue texts; sentiment analysis; graph convolutional neural network; bidirectional gated recurrent unit
