
Aspect-level sentiment classification model combining Transformer and interactive attention network
(融合Transformer和交互注意力网络的方面级情感分类模型)
Abstract: At present, most researchers use a combination of recurrent neural networks and attention mechanisms for aspect-level sentiment classification. However, recurrent neural networks cannot be computed in parallel, and such models suffer from truncated backpropagation, vanishing gradients, and exploding gradients during training; in addition, traditional attention mechanisms may assign low attention weights to important sentiment words in a sentence. To address these problems, this paper proposes an aspect-level sentiment classification model that combines a Transformer with an interactive attention network. The model first uses the pretrained BERT (Bidirectional Encoder Representations from Transformers) model to construct word embedding vectors, then encodes the input sentence in parallel with Transformer encoders, and subsequently applies contextual dynamic masking and contextual dynamic weighting mechanisms to focus on the local context information that is semantically most relevant to the given aspect term. Experimental results on five English datasets and four Chinese review datasets show that the proposed model achieves the best performance in terms of both accuracy and F1 score.
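The local-context step described in the abstract can be illustrated with a short sketch. The code below is a minimal, hypothetical illustration (not the authors' released implementation) of contextual dynamic masking (CDM) and contextual dynamic weighting (CDW) applied to Transformer-encoded token features around an aspect term; the semantic-relative-distance definition, the threshold value, and all function names are assumptions made purely for illustration.

```python
# Minimal sketch of CDM/CDW over encoder outputs. Assumptions: tokens inside the
# aspect span have distance 0, a fixed distance threshold controls the local
# window, and CDW decays weights linearly with distance.
import torch


def semantic_relative_distance(seq_len: int, aspect_start: int, aspect_end: int) -> torch.Tensor:
    """Distance of each token position to the aspect span [aspect_start, aspect_end)."""
    pos = torch.arange(seq_len)
    dist = torch.zeros(seq_len)
    dist[pos < aspect_start] = (aspect_start - pos[pos < aspect_start]).float()
    dist[pos >= aspect_end] = (pos[pos >= aspect_end] - aspect_end + 1).float()
    return dist  # tokens inside the aspect span keep distance 0


def cdm(features: torch.Tensor, srd: torch.Tensor, threshold: int = 3) -> torch.Tensor:
    """Contextual dynamic mask: zero out features of tokens outside the local window."""
    keep = (srd <= threshold).float().unsqueeze(-1)  # (seq_len, 1)
    return features * keep                           # (seq_len, hidden)


def cdw(features: torch.Tensor, srd: torch.Tensor, threshold: int = 3) -> torch.Tensor:
    """Contextual dynamic weighting: down-weight distant tokens instead of masking them."""
    seq_len = features.size(0)
    weight = torch.ones(seq_len)
    far = srd > threshold
    weight[far] = 1.0 - (srd[far] - threshold) / seq_len  # decays toward 0 with distance
    return features * weight.clamp(min=0.0).unsqueeze(-1)


# Usage: `encoded` stands in for Transformer-encoder output over BERT embeddings,
# shape (seq_len, hidden); the aspect term occupies positions 4-5.
encoded = torch.randn(12, 768)
srd = semantic_relative_distance(seq_len=12, aspect_start=4, aspect_end=6)
local_masked = cdm(encoded, srd)    # hard local focus
local_weighted = cdw(encoded, srd)  # soft local focus
```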
Authors: CHENG Yan (程艳); HU Jiansheng (胡建生); ZHAO Songhua (赵松华); LUO Pin (罗品); ZOU Haifeng (邹海锋); ZHAN Yongxin (詹勇鑫); FU Yan (富雁); LIU Chunlei (刘春雷) (School of Computer Information Engineering, Jiangxi Normal University, Nanchang 330022, China; Key Laboratory of Intelligent Information Processing and Emotional Computing in Jiangxi Province, Nanchang 330022, China; Jiangxi Ruanyun Technology Corporation Limited, Nanchang 330200, China; Jiangxi Heyi Technology Co., Ltd., Nanchang 330200, China)
Source: CAAI Transactions on Intelligent Systems (《智能系统学报》), indexed in CSCD and the Peking University Core Journal list, 2024, No. 3, pp. 728-737 (10 pages)
Funding: National Natural Science Foundation of China (62167006, 61967011); Jiangxi Provincial Science and Technology Innovation Base Project - Key Laboratory of Intelligent Information Processing and Emotional Computing in Jiangxi Province (formerly the Jiangxi Provincial Key Laboratory of Intelligent Education) (20212BCD42001); Jiangxi Province 03 Special Project and 5G Project (20212ABC03A22); Jiangxi Province Training Program for Academic and Technical Leaders in Major Disciplines - Leading Talents Project (20213BCJL22047); Natural Science Foundation of Jiangxi Province (20212BAB202017).
Keywords: aspect term; sentiment classification; recurrent neural network; Transformer; interactive attention network; BERT; local feature; deep learning