Abstract
In aspect-level sentiment analysis, attention mechanisms and convolutional neural network models cannot capture the dependencies between long-distance words in a sentence and the related syntactic constraints, and they treat context words that are syntactically irrelevant as clues for judging aspect sentiment. To address this problem, this paper proposes an aspect-level sentiment classification model (ASGCN-AOA) that combines a graph convolutional network (GCN) with an attention-over-attention (AOA) neural network. First, a bidirectional long short-term memory network is used to model the aspect-specific representations among context words. Second, a graph convolutional network (GCN) is built over the dependency tree of each sentence to obtain aspect features that account for both syntactic dependencies and long-distance multi-word relations. Finally, the AOA attention mechanism captures the interaction and representation between aspect words and the context sentence, automatically attending to the important parts of the sentence. Experiments are carried out on five datasets (Twitter, Lap14, Rest14, Rest15, and Rest16), evaluated by Accuracy and Macro-F1. The results show that the proposed model achieves a clear improvement over other aspect-based analysis algorithms.
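The two model components named in the abstract can be illustrated in miniature. The following is a minimal NumPy sketch (not the authors' implementation): one degree-normalized GCN layer aggregating over a toy dependency-tree adjacency matrix, followed by the AOA step, which builds an interaction matrix between aspect and context representations, applies column-wise and row-wise softmax, and combines them into final context attention weights. All array sizes and the toy adjacency are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gcn_layer(H, A, W):
    """One GCN layer over a dependency graph.
    H: (n, d) node features; A: (n, n) adjacency with self-loops; W: (d, d')."""
    deg = A.sum(axis=1, keepdims=True)        # node degrees for normalization
    return np.maximum((A @ H @ W) / deg, 0.0) # mean-aggregate neighbors, then ReLU

def aoa(context, aspect):
    """Attention-over-attention: interaction matrix + dual softmax.
    context: (n, d), aspect: (m, d) -> (n,) final context attention weights."""
    I = context @ aspect.T          # (n, m) pairwise interaction scores
    alpha = softmax(I, axis=0)      # column-wise: context attention per aspect word
    beta = softmax(I, axis=1)       # row-wise: aspect attention per context word
    beta_bar = beta.mean(axis=0)    # (m,) averaged aspect-level attention
    return alpha @ beta_bar         # (n,) attention over context words

rng = np.random.default_rng(0)
n, m, d = 6, 2, 8                   # 6 context words, 2-word aspect, hidden size 8
H = rng.standard_normal((n, d))     # stand-in for Bi-LSTM hidden states
A = np.eye(n)                       # self-loops
A[0, 1] = A[1, 0] = A[1, 3] = A[3, 1] = 1.0  # toy dependency edges
W = rng.standard_normal((d, d))
H_gcn = gcn_layer(H, A, W)          # syntax-aware features
gamma = aoa(H_gcn, H_gcn[2:4])      # rows 2..3 taken as the aspect span
print(gamma.shape)                  # weights over the 6 context words; they sum to 1
```

The final weights sum to 1 by construction: each column of `alpha` is a distribution over context words, and `beta_bar` is a distribution over aspect words, so their product is a convex combination. A sentiment classifier would then score the weighted context representation `gamma @ H_gcn`.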
Authors
XIA Hongbin; GU Yan; LIU Yuan (School of Artificial Intelligence and Computer, Jiangnan University, Wuxi, Jiangsu 214122, China; Jiangsu Key Laboratory of Media Design and Software Technology, Wuxi, Jiangsu 214122, China)
Source
Journal of Chinese Information Processing (《中文信息学报》)
CSCD
Peking University Core Journal
2022, No. 3, pp. 146-153 (8 pages)
Funding
National Natural Science Foundation of China (61672264).
Keywords
natural language understanding
graph convolutional network
long short-term memory network
attention-over-attention neural network
artificial intelligence