
基于BERT和超图对偶注意力网络的文本情感分析

Text sentiment analysis based on BERT and hypergraph with dual attention network
Abstract: To address the heavy noise and lack of contextual information in short texts on the Web, this paper proposes a text sentiment analysis model based on BERT and a hypergraph dual attention network. First, the strong representation-learning ability of the pre-trained BERT model is used to extract dynamic features from sentiment texts. Meanwhile, the contextual order, topic, and semantic-dependency information of the text is mined and modeled as a hypergraph, and a dual graph attention mechanism aggregates this relational information. Finally, the features extracted by the BERT and hypergraph dual-attention modules are concatenated, and a softmax layer produces the predicted sentiment polarity. The model reaches accuracies of 95.49% on a binary e-commerce review dataset and 79.83% on a six-class microblog text dataset, improvements of 2.27%~3.45% and 6.97%~11.69% over the baseline models, respectively. Ablation experiments further verify the contribution of each component to the classification results. The experimental results show that the model significantly improves the accuracy of sentiment analysis for Chinese Web short texts.
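The final fusion step described in the abstract (concatenating the BERT features with the hypergraph dual-attention features, then applying a softmax classifier) can be sketched as follows. This is a minimal illustration with numpy; all dimension sizes, variable names, and the random stand-in feature vectors are assumptions for demonstration, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: 768 for the BERT feature, 128 for the hypergraph
# feature, 2 sentiment classes (the binary e-commerce setting).
d_bert, d_hyper, n_classes = 768, 128, 2

# Stand-ins for the two module outputs: in the actual model these would
# come from BERT and from the hypergraph dual-attention network.
h_bert = rng.standard_normal(d_bert)
h_hyper = rng.standard_normal(d_hyper)

# Feature concatenation (the "splicing" step in the abstract).
h = np.concatenate([h_bert, h_hyper])

# A single linear layer followed by softmax over sentiment classes.
W = rng.standard_normal((n_classes, d_bert + d_hyper)) * 0.01
b = np.zeros(n_classes)
logits = W @ h + b

# Numerically stable softmax.
probs = np.exp(logits - logits.max())
probs /= probs.sum()
```

The resulting `probs` vector sums to one and gives the predicted probability of each sentiment class; the class with the highest probability is taken as the model's prediction.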
Authors: Xu Guixian; Liu Lanyin; Wang Jiacheng; Chen Zhe (Key Laboratory of Ethnic Language Intelligent Analysis & Security Governance of MOE, Minzu University of China, Beijing 100081, China; School of Information Engineering, Minzu University of China, Beijing 100081, China)
Source: Application Research of Computers (《计算机应用研究》), CSCD, Peking University Core, 2024, No. 3, pp. 786-793
Fund: National Social Science Fund of China (19BGL241)
Keywords: text sentiment analysis; hypergraph; graph classification; attention mechanism