
Research on Aspect-Level Sentiment Analysis Method Based on Self-Attention
Abstract: Traditional models fall short on fine-grained aspect-level sentiment analysis: RNNs suffer from long-distance dependency problems and cannot be computed in parallel, while CNNs typically include pooling layers whose operations discard relative position information and some important features, and CNNs do not take the contextual information of the text into account. This paper proposes a Light-Transformer-ALSC model based on the Self-Attention mechanism. Drawing on the idea of interactive attention, it extracts features with separate attention modules for the aspect words and for the context, enabling fine-grained sentiment analysis of the text. Experimental results on the SemEval-2014 Task 4 dataset show that the proposed model outperforms most models based only on LSTM. Excluding BERT-based models, accuracy improves by 1.3%~5.3% on the Laptop dataset and by 2.5%~5.5% on the Restaurant dataset; compared with BERT-based models, the number of model parameters is greatly reduced while accuracy remains comparable.
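The abstract only names the key idea (separate attention modules for aspect words and context, interacting with each other); the actual Light-Transformer-ALSC architecture is not specified here. The following is a minimal sketch of that interactive-attention idea in PyTorch; all layer sizes, module names, the pooling scheme, and the classifier head are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class InteractiveAttentionSketch(nn.Module):
    """Sketch: aspect and context each attend over the other's representation."""

    def __init__(self, d_model=128, n_heads=4, n_classes=3):
        super().__init__()
        # One attention module per stream, as the abstract suggests.
        self.ctx_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.asp_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Assumed 3-way head: negative / neutral / positive.
        self.classifier = nn.Linear(2 * d_model, n_classes)

    def forward(self, context, aspect):
        # context: (batch, ctx_len, d_model); aspect: (batch, asp_len, d_model)
        # Context queries attend to the aspect representation...
        ctx_out, _ = self.ctx_attn(context, aspect, aspect)
        # ...and aspect queries attend to the context representation.
        asp_out, _ = self.asp_attn(aspect, context, context)
        # Mean-pool both streams and classify their concatenation.
        pooled = torch.cat([ctx_out.mean(dim=1), asp_out.mean(dim=1)], dim=-1)
        return self.classifier(pooled)

# Example usage with random embeddings (batch=2, ctx_len=20, asp_len=3).
model = InteractiveAttentionSketch()
logits = model(torch.randn(2, 20, 128), torch.randn(2, 3, 128))
print(logits.shape)  # torch.Size([2, 3])
```

Because both streams are built from attention rather than recurrence, the whole forward pass is parallel over sequence positions, which is the RNN limitation the abstract calls out.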
Author: CAI Yang (School of Computer Science and Technology, Zhejiang Sci-Tech University, Hangzhou 310018, China)
Source: Intelligent Computer and Applications, 2023, Issue 8, pp. 150-154, 157 (6 pages)
Keywords: aspect-level sentiment analysis; Self-Attention; Transformer; SemEval-2014 Task 4; BERT