Abstract
Mining the sentiment orientation of review texts has become a research hotspot in natural language processing in recent years. Taking the mining of product review data from JD.com (Jingdong Mall) as its research perspective and recurrent neural networks in deep learning as its theoretical basis, this paper applies variant models of the recurrent neural network to the text sentiment classification task and compares the review-text classification performance of the different improved models. The paper first studies the classification performance of two recurrent neural network variants, the long short-term memory (LSTM) model and the gated recurrent unit (GRU) model, on JD.com product review texts. Experiments show that the GRU model achieves higher accuracy during training and reaches its optimum earlier; overall, the GRU network outperforms the LSTM network on text classification. The paper then studies sentiment-word-driven attention neural network models built on each recurrent neural network variant, combining each deep neural network model with the attention mechanism and comparing the sentiment classification performance of the combined models. Experiments show that the networks with the attention mechanism improve classification accuracy over the corresponding traditional network models and converge to their optimum faster.
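The abstract itself contains no code; as a rough illustration only, the following PyTorch sketch shows one common way to combine a GRU encoder with a simple learned attention layer for binary sentiment classification of review texts. The class name, hyperparameters, and the attention formulation are illustrative assumptions, not the paper's actual architecture; in particular, the sentiment-word-driven attention described in the paper is not reproduced here.

    # Minimal, hypothetical sketch of a GRU + attention sentiment classifier
    # (illustrative only; not the paper's exact model or hyperparameters).
    import torch
    import torch.nn as nn

    class GRUAttentionClassifier(nn.Module):
        def __init__(self, vocab_size=20000, embed_dim=128,
                     hidden_dim=128, num_classes=2):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
            self.gru = nn.GRU(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
            # One attention score per time step over the GRU outputs.
            self.attn = nn.Linear(2 * hidden_dim, 1)
            self.fc = nn.Linear(2 * hidden_dim, num_classes)

        def forward(self, token_ids):                     # (batch, seq_len)
            x = self.embedding(token_ids)                 # (batch, seq_len, embed_dim)
            h, _ = self.gru(x)                            # (batch, seq_len, 2*hidden_dim)
            scores = self.attn(h).squeeze(-1)             # (batch, seq_len)
            weights = torch.softmax(scores, dim=1)        # attention distribution
            context = (weights.unsqueeze(-1) * h).sum(1)  # weighted sum of states
            return self.fc(context)                       # (batch, num_classes)

    # Usage on a dummy batch of two padded review sequences:
    model = GRUAttentionClassifier()
    logits = model(torch.randint(1, 20000, (2, 50)))
    print(logits.shape)  # torch.Size([2, 2])

Replacing nn.GRU with nn.LSTM (and handling its extra cell state) yields the LSTM counterpart compared in the paper's first experiment; dropping the attention layer and classifying from the final hidden state gives the "traditional" baseline against which the attention variants are measured.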
Author
Ji Wenli (计文丽), Geely University of China, Chengdu, China
Source
Scientific and Technological Innovation (《科学技术创新》)
2024, Issue 3, pp. 100-105 (6 pages)
Keywords
sentiment classification
recurrent neural network
long short-term memory
gated recurrent unit
attention mechanism