
Target-based Sentiment Analysis Fusing a Direction-aware Transformer (Cited by: 1)
Abstract: Target-Based Sentiment Analysis (TBSA) is one of the most challenging topics in sentiment analysis; it must solve two subtasks simultaneously: target extraction and target-specific sentiment classification. Existing work still suffers from two problems: first, models cannot fully exploit target boundary and sentiment information; second, they commonly rely on Long Short-Term Memory networks for feature extraction, which cannot capture the internal relationships within an input sentence. To address these problems, this paper introduces a direction-aware Transformer and proposes DNTSA (Dual-assist Network based model for Target Sentiment Analysis). Its core idea is to use the direction-aware Transformer as a feature extractor to effectively align the intrinsic connections between multiple target words and sentiment words, while a dual-assist network further strengthens the model's sentiment recognition and target extraction abilities. On the three public datasets Laptop, Restaurant, and Twitter, the proposed method improves F1 over the baseline E2E-TBSA by 2.3%, 1.8%, and 3.9%, respectively.
Authors: CAI Rui-chu, YIN Wan, XU Bo-yan (School of Computer Science, Guangdong University of Technology, Guangzhou 510006, China)
Source: Journal of Chinese Computer Systems (《小型微型计算机系统》), CSCD, Peking University Core, 2022, No. 11, pp. 2285-2292 (8 pages)
Funding: Supported by the National Natural Science Foundation of China (61876043, 61976052).
Keywords: target-based sentiment analysis; Transformer; text representation; multi-task learning; attention mechanism
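The abstract's key claim is that a direction-aware Transformer, rather than an LSTM, aligns target words with sentiment words. One common way to make self-attention direction-aware is to add a directional bias to the attention scores so that left-context and right-context tokens are scored differently. The sketch below illustrates that general idea only; the function names, bias form, and values are assumptions for illustration, not the paper's actual DNTSA implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def direction_aware_attention(X, Wq, Wk, Wv, b_fwd=0.5, b_bwd=-0.5):
    """Scaled dot-product self-attention with a directional bias.

    Tokens to the right of the query position get b_fwd added to their
    attention score and tokens to the left get b_bwd, so the model can
    distinguish context before vs. after a candidate target word
    (hypothetical bias scheme, for illustration).
    """
    n, _ = X.shape
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = (Q @ K.T) / np.sqrt(K.shape[-1])
    # Directional bias matrix: +b_fwd above the diagonal (future tokens),
    # +b_bwd below it (past tokens), 0 on the diagonal.
    i = np.arange(n)[:, None]
    j = np.arange(n)[None, :]
    bias = np.where(j > i, b_fwd, np.where(j < i, b_bwd, 0.0))
    return softmax(scores + bias) @ V
```

Because the bias breaks the left/right symmetry of plain dot-product attention, the same sentence read forward and backward yields different representations, which is the property the abstract attributes to the direction-aware extractor.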