
A Deep Learning-Based Text Sentiment Analysis Method for Web Comments
Abstract: Among the many research areas in natural language processing, sentiment analysis of text has become a prominent topic. To address the problems of poor semantic representation in text vectors and insufficient feature extraction, which lead to inaccurate classification in sentiment analysis tasks, this study proposes a deep learning text classification framework, RoBERTa-BiLSTM-Multi-Head-Attention (RBM), which integrates the RoBERTa-wwm-ext model with a multi-head attention mechanism. The model first uses the pre-trained RoBERTa-wwm-ext language model to capture the dynamic characteristics of the text; a bidirectional long short-term memory network (Bi-LSTM) then extracts deeper semantic relationships, and the final time-step output is fed as a feature vector into the multi-head attention layer; finally, a fully connected layer yields the classification result. In a series of comparative experiments, the proposed RBM-based classification model achieved higher accuracy, precision, recall, and F1 scores on the ChnSentiCorp web comment dataset, extracted character and word features from the text effectively, and improved the performance of sentiment analysis on Chinese review texts.
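The RoBERTa → Bi-LSTM → multi-head attention → fully connected pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: all dimensions (768-d encoder output, 256-d LSTM hidden size, 8 attention heads) are assumptions, and the random tensor stands in for hidden states that would come from the pre-trained RoBERTa-wwm-ext encoder. Using the final time step as the attention query is one plausible reading of "the final time-step output being used as a feature vector input into the multi-head attention layer."

```python
import torch
import torch.nn as nn

class RBMClassifier(nn.Module):
    """Sketch of the RBM head: Bi-LSTM + multi-head attention + FC layer."""
    def __init__(self, hidden=768, lstm_hidden=256, heads=8, num_classes=2):
        super().__init__()
        # Bi-LSTM over encoder hidden states; outputs 2*lstm_hidden per step.
        self.bilstm = nn.LSTM(hidden, lstm_hidden, batch_first=True,
                              bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * lstm_hidden, heads,
                                          batch_first=True)
        self.fc = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, encoder_states):
        # encoder_states: (batch, seq_len, hidden), e.g. RoBERTa-wwm-ext output.
        seq, _ = self.bilstm(encoder_states)
        # Final time-step output as the feature vector / attention query.
        query = seq[:, -1:, :]
        ctx, _ = self.attn(query, seq, seq)   # attend over the full sequence
        return self.fc(ctx.squeeze(1))        # (batch, num_classes)

x = torch.randn(4, 32, 768)   # stand-in for pre-trained encoder hidden states
logits = RBMClassifier()(x)
print(logits.shape)           # torch.Size([4, 2])
```

In practice the encoder states would be produced by a Hugging Face `RobertaModel` loaded from the RoBERTa-wwm-ext checkpoint, and the whole stack would be fine-tuned end to end on the labeled comment texts.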
Authors: Dayi Li (李大一); Youguo Wang (王友国); Qiqing Zhai (翟其清) — School of Science, Nanjing University of Posts and Telecommunications, Nanjing, Jiangsu
Source: Modeling and Simulation (《建模与仿真》), 2024, No. 5, pp. 5372–5381 (10 pages)
Funding: National Natural Science Foundation of China (62071248, 62201284)
Keywords: Text Sentiment Analysis; RoBERTa-wwm-ext; Bi-LSTM; Multi-Head Attention