Abstract
Text sentiment classification, which aims to analyze the subjective sentiment polarity of text, has become a hot topic in natural language processing in recent years, and fine-grained sentiment classification with respect to specific aspects is receiving growing attention. Adding an attention mechanism to traditional deep models can significantly improve classification performance. Targeting the characteristics of the Chinese language, a deep model combining a multi-hop attention mechanism with a convolutional neural network (MHA-CNN) is proposed. The model uses multi-dimensional combined features to remedy the deficiency of one-dimensional feature attention mechanisms, and can extract deeper aspect-level sentiment features without any prior knowledge. Compared with attention-based long short-term memory (LSTM) networks, the model has a lower training time overhead and preserves the local word-order information of features. Finally, experiments on a publicly available Chinese dataset (covering six domains) yield better classification results than ordinary deep network models, the attention-based LSTM model, and the attention-based deep memory network model.
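As a hedged illustration only (the paper's actual layer dimensions, hop update rule, and CNN configuration are not given in this abstract), a minimal NumPy sketch of the multi-hop attention idea, where an aspect-conditioned query repeatedly attends over a word-embedding memory, might look like:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_hop_attention(memory, aspect, hops=3):
    """Illustrative multi-hop attention (hypothetical form, not the paper's exact model).

    memory: (n_words, d) word embeddings of the sentence.
    aspect: (d,) embedding of the target aspect.
    Each hop scores the memory against the current query, takes the
    attention-weighted context, and folds it back into the query.
    """
    query = aspect.copy()
    for _ in range(hops):
        scores = memory @ query        # (n_words,) dot-product scores
        weights = softmax(scores)      # attention distribution over words
        context = weights @ memory     # (d,) weighted sum of word vectors
        query = query + context        # residual update between hops
    return query

rng = np.random.default_rng(0)
memory = rng.normal(size=(6, 8))       # six words, 8-dim embeddings
aspect = rng.normal(size=8)
rep = multi_hop_attention(memory, aspect)
print(rep.shape)                       # (8,)
```

In the proposed MHA-CNN, a representation of this kind would then feed a convolutional layer rather than an LSTM, which is what allows the model to keep local word-order information at a lower training cost.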
Authors
DENG Yu; LEI Hang; LI Xiao-yu; LIN Yi-ou (School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu 610054)
Source
Journal of University of Electronic Science and Technology of China (《电子科技大学学报》)
Indexed in: EI, CAS, CSCD, PKU Core (北大核心)
2019, No. 5, pp. 759-766 (8 pages)
Funding
National Natural Science Foundation of China (61502082)
Fundamental Research Funds for the Central Universities (ZYGX2014J065)
Keywords
aspect-based sentiment classification
attention mechanism
convolutional neural network
deep learning
natural language processing