Abstract
Sentiment analysis is an important semantic task in natural language processing (NLP). The two mainstream families of models for NLP tasks are convolutional neural networks (CNN) and recurrent neural networks (RNN), together with their variants. Because natural language exhibits structural dependencies and important information may appear anywhere in a sentence, an RNN may overlook such information. To address these problems, we propose a new model, ABGC, which adds an attention mechanism to a BiLSTM so that the most important local information in a sentence is captured more effectively, and fuses it with a convolutional neural network (CNN) equipped with gated linear units (GLU) to better capture the global information of the text. The features extracted by the two branches are then fused, which both avoids the vanishing-gradient problem of the LSTM and resolves the CNN's neglect of contextual semantics. Comparative experiments on two datasets show that the ABGC model effectively improves text classification accuracy while reducing running time.
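The architecture described above can be summarized in a minimal PyTorch-style sketch. This is one plausible reading of ABGC, not the authors' released code: the embedding and hidden sizes, the additive-attention form, max-over-time pooling, and concatenation as the fusion step are assumptions not specified in the abstract.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ABGC(nn.Module):
    """Hypothetical sketch: an attention-weighted BiLSTM branch fused with a
    GLU-gated CNN branch, followed by a linear classifier."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128,
                 num_filters=128, kernel_size=3, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)

        # BiLSTM branch with additive attention over time steps (assumed form).
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)

        # CNN branch with a GLU: the convolution emits 2 * num_filters
        # channels, half of which gate the other half.
        self.conv = nn.Conv1d(embed_dim, 2 * num_filters, kernel_size,
                              padding=kernel_size // 2)

        self.fc = nn.Linear(2 * hidden_dim + num_filters, num_classes)

    def forward(self, x):                       # x: (batch, seq_len) token ids
        emb = self.embedding(x)                 # (batch, seq_len, embed_dim)

        # --- Attention-BiLSTM branch ---
        h, _ = self.bilstm(emb)                 # (batch, seq_len, 2*hidden_dim)
        scores = F.softmax(self.attn(h), dim=1) # attention weights per step
        lstm_feat = (scores * h).sum(dim=1)     # (batch, 2*hidden_dim)

        # --- GLU-CNN branch ---
        c = self.conv(emb.transpose(1, 2))      # (batch, 2*num_filters, seq_len)
        c = F.glu(c, dim=1)                     # (batch, num_filters, seq_len)
        cnn_feat = c.max(dim=2).values          # max-over-time pooling

        # --- Feature fusion and classification (concatenation assumed) ---
        fused = torch.cat([lstm_feat, cnn_feat], dim=1)
        return self.fc(fused)
```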
Authors
SUN Cheng-ai (孙承爱), DING Yu (丁宇), TIAN Gang (田刚)
College of Computer Science and Engineering, Shandong University of Science and Technology, Qingdao 266000, China
Source
Software (《软件》), 2019, No. 7, pp. 62-66 (5 pages)
Funding
National Natural Science Foundation of China Youth Program (No. 61602279)
National Natural Science Foundation of China Youth Program (No. 02030063802)