DA-Transformer: Text Sentiment Analysis Method Based on "Gate" Attention
Abstract: To address the problem that the Transformer tends to cause information redundancy when modeling words in Chinese text, a sentiment analysis model combining a "gate" attention with the Transformer (DA-Transformer) is proposed. The model builds long-range dependencies in text by inserting a self-attention-based "gate" attention (DA) into the encoding and decoding stages of the Transformer, which accelerates the model's learning of the weight ratio between deep and shallow features. The model is validated on the ChnSentiCorp_htl_al and weibo_senti datasets. Experiments show that its accuracy is 1.8% higher than that of BLSTM and 0.9% higher than that of BLSTM-Attention, indicating that the proposed model is both effective and practical.
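The abstract describes the DA block only at a high level: a self-attention-based gate, inserted into the Transformer's encoding and decoding stages, that weighs deep (attended) features against shallow (input) features. Below is a minimal PyTorch sketch of one plausible reading; the class name GateAttention, the sigmoid gating scheme, and all hyperparameters are assumptions for illustration, not the paper's exact formulation.

```python
# Illustrative sketch only: the paper's exact DA formulation is not given in
# this listing, so the gating scheme below is an assumption.
import torch
import torch.nn as nn

class GateAttention(nn.Module):
    """Hypothetical DA block: a sigmoid gate, computed from the input and the
    self-attention output, mixes shallow (input) and deep (attended) features."""
    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.gate_proj = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Deep features: self-attention over the whole sequence, giving the
        # long-range dependencies the abstract refers to.
        deep, _ = self.self_attn(x, x, x)
        # Gate g in (0, 1) learns, per dimension, the weight ratio between
        # deep and shallow features (assumed weighting scheme).
        g = torch.sigmoid(self.gate_proj(torch.cat([x, deep], dim=-1)))
        return g * deep + (1.0 - g) * x

# Usage: apply to a batch of token embeddings, e.g. between encoder layers.
x = torch.randn(2, 16, 512)   # (batch, seq_len, d_model)
da = GateAttention(d_model=512)
print(da(x).shape)            # torch.Size([2, 16, 512])
```

The convex mix g * deep + (1 - g) * x is one common way to realize such a gate; it lets the model suppress redundant attended features by pushing g toward zero, which matches the abstract's stated goal of reducing information redundancy.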
Authors: LI Miao, GUAN Li, ZHANG Yang (School of Big Data and AI, Anhui Xinhua University, Hefei 230088, China; School of Computer and Telecommunications Engineering, Dalian Jiaotong University, Dalian 116000, China)
Source: Journal of Xi'an University (Natural Science Edition), 2023, Issue 4, pp. 35-39 (5 pages)
Funding: 2021 Anhui Province Major Quality Engineering Teaching and Research Project, "Research on the Construction and Reform of the Artificial Intelligence Major Cluster in Application-Oriented Undergraduate Universities" (2021jyxm0616)
Keywords: text sentiment analysis; Transformer model; information redundancy; self-attention mechanism; DA