End-to-End Aspect-Based Sentiment Analysis Model Based on BERT and LSI
Abstract: To address the shortcoming that existing research on end-to-end aspect-based sentiment analysis (E2E-ABSA) does not fully exploit textual information, a model named LSI-BERT is proposed, which is based on BERT and fused lexical and syntactic information (LSI). A BERT embedding layer and a TFM feature extractor are used to extract semantic information, and part-of-speech (lexical) information is extracted with the industrial-grade natural language processing tool SpaCy; two weighting factors α and β are introduced to fuse the semantic and lexical information. A graph attention network (GAT) is used to extract syntactic dependency information based on the adjacency matrix generated from the syntactic dependency tree. A dual-stream attention network then fuses the syntactic dependency information with the text representation enriched by lexical information, allowing the two kinds of information to interact more effectively. Experimental results show that the model outperforms current representative models on three commonly used benchmark datasets.
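The abstract describes ingredients that are easy to prototype outside the full model: part-of-speech (lexical) features and a dependency-tree adjacency matrix obtained from SpaCy, and a weighted fusion of semantic and lexical features via the factors α and β. The sketch below is not the authors' implementation; the hidden size, the fusion weights, the random stand-in for the BERT/TFM semantic features, and the simplified graph-masked attention step (a toy stand-in for a real GAT layer and the dual-stream attention network) are all illustrative assumptions.

```python
# Minimal sketch (not the authors' code): spaCy supplies the POS features and the
# dependency-tree adjacency matrix described in the abstract, and two scalar
# weights alpha/beta fuse semantic and lexical features. All numeric values are illustrative.
import numpy as np
import spacy

nlp = spacy.load("en_core_web_sm")   # assumes the small English model is installed
doc = nlp("The battery life is great but the screen is dim")
n = len(doc)

# Lexical information: one-hot POS tag per token.
pos_tags = sorted({t.pos_ for t in doc})
pos_feat = np.zeros((n, len(pos_tags)))
for t in doc:
    pos_feat[t.i, pos_tags.index(t.pos_)] = 1.0

# Syntactic information: symmetric adjacency matrix of the dependency tree
# (with self-loops), the form consumed by a graph attention network.
adj = np.eye(n)
for t in doc:
    adj[t.i, t.head.i] = 1.0
    adj[t.head.i, t.i] = 1.0

# Semantic information: random stand-in for BERT + TFM token features (hidden size 768).
h_semantic = np.random.randn(n, 768)

# Fuse semantic and POS features with two weighting factors alpha and beta;
# a linear map lifts the POS one-hots to the same width.
W_pos = np.random.randn(len(pos_tags), 768) * 0.02
alpha, beta = 0.7, 0.3               # illustrative values, not the paper's tuned ones
h_fused = alpha * h_semantic + beta * (pos_feat @ W_pos)

# One graph-masked self-attention step (a simplified stand-in for a GAT layer):
# tokens attend only along dependency edges and self-loops.
scores = h_fused @ h_fused.T / np.sqrt(768)
scores = np.where(adj > 0, scores, -1e9)
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)
h_syntax = attn @ h_fused

print(h_fused.shape, adj.shape, h_syntax.shape)   # (n, 768), (n, n), (n, 768)
```

In the full LSI-BERT model the fused token representations and the graph-derived representations would be combined again by the dual-stream attention network before prediction; this sketch stops at the two intermediate tensors.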
Authors: DAI Jiamei, KONG Weiwei, WANG Ze, LI Peizhe (School of Computer Science and Technology, Xi'an University of Posts and Telecommunications, Xi'an 710121, China; Shaanxi Provincial Key Laboratory of Network Data Analysis and Intelligent Processing, Xi'an University of Posts and Telecommunications, Xi'an 710121, China)
Source: Computer Engineering and Applications (《计算机工程与应用》, CSCD, Peking University Core Journal), 2024, No. 12, pp. 144-152 (9 pages)
Keywords: end-to-end; aspect-based sentiment analysis; graph attention networks (GAT); weighting factors; dual-stream attention networks