
Sentiment Analysis of Scenic Spot Reviews Based on Improved BERT

Cited by: 1
Abstract: To address the underuse of the sentiment information contained in reviews in the field of scenic-spot recommendation, this paper proposes an improved BERT model (BERT-MRN) that combines a residual network with a bidirectional long short-term memory (BiLSTM) network. The deep learning model BERT is pre-trained in an unsupervised manner and relies on downstream tasks for inference and judgment, so at the semantic level it ignores domain-specific knowledge. This paper studies a multi-head residual network (MRN) that learns sentiment features at multiple levels, supplementing domain knowledge while avoiding the degradation problem of deep neural networks. The BERT pre-trained model provides semantic features, the sentiment of each sentence is dynamically embedded into the model, and the features output by the model at each level are fused with the output of the BiLSTM network to obtain the final text sentiment. Compared with the experimental results of the BERT pre-trained model, the BERT-BiLSTM model, the BiLSTM model, and a convolutional neural network (CNN), the proposed model achieves significantly higher F1 score and accuracy. It is therefore concluded that the BERT-MRN model offers improved sentiment-polarity analysis and can better analyze the sentiment information in scenic-spot reviews.
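The pipeline described in the abstract (BERT semantic features, a multi-head residual network for sentiment features, and fusion with a BiLSTM branch) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: a random tensor stands in for the BERT encoder output, and the layer sizes, the number of heads, the mean-pooling, and the concatenation-based fusion are all assumptions, since the abstract does not specify them.

```python
import torch
import torch.nn as nn

class ResidualHead(nn.Module):
    """One head of the multi-head residual network (MRN)."""
    def __init__(self, dim):
        super().__init__()
        self.fc = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x):
        # Residual (skip) connection mitigates deep-network degradation.
        return x + self.fc(x)

class BertMRN(nn.Module):
    """BERT features -> stacked residual heads + BiLSTM branch -> fused sentiment logits."""
    def __init__(self, hidden=64, heads=4, classes=2):
        super().__init__()
        self.heads = nn.ModuleList(ResidualHead(hidden) for _ in range(heads))
        # BiLSTM with hidden//2 units per direction so its output width matches `hidden`.
        self.bilstm = nn.LSTM(hidden, hidden // 2, bidirectional=True, batch_first=True)
        self.cls = nn.Linear(hidden * 2, classes)

    def forward(self, feats):            # feats: (batch, seq_len, hidden) BERT outputs
        mrn = feats
        for head in self.heads:
            mrn = head(mrn)              # residual heads refine sentiment features
        lstm_out, _ = self.bilstm(feats) # parallel BiLSTM branch
        # Feature fusion: mean-pool each branch over the sequence, then concatenate.
        fused = torch.cat([mrn.mean(dim=1), lstm_out.mean(dim=1)], dim=-1)
        return self.cls(fused)

model = BertMRN()
logits = model(torch.randn(2, 10, 64))  # stand-in for BERT encoder output
print(logits.shape)                     # torch.Size([2, 2])
```

In a real system the stand-in tensor would be replaced by the last hidden states of a pre-trained BERT encoder, and the two-class logits would be trained with cross-entropy on labeled review sentiment.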
Authors: LIU Yuze; YE Qing; LIU Jianping (School of Electrical and Information Engineering, Changsha University of Science and Technology, Changsha 410114, China)
Source: Sensor World, 2022, No. 12, pp. 24-29, I0003 (7 pages)
Keywords: sentiment analysis; BERT; residual network; BiLSTM