Chinese Sentiment Analysis Based on a BERT-Based BiGRU-Attention-CNN Hybrid Model
Abstract: At the word-embedding level, Chinese sentiment analysis typically uses one-hot encoding or Word2Vec to generate word-vector representations, which cannot adequately handle polysemy; at the feature-extraction level, traditional deep learning models lack a focused treatment of the most important features. To address these problems, a Chinese sentiment analysis method based on a BERT-based BiGRU-Attention-CNN hybrid neural network model is proposed. The BERT model produces rich dynamic word vectors; these are combined with BiGRU's ability to capture long-range contextual dependencies and CNN's feature-extraction capability, while an Attention mechanism is incorporated to assign different weights so that important features receive more emphasis. Sentiment classification experiments are conducted on four publicly available Chinese datasets: hotel reviews, takeout reviews, online shopping reviews, and Weibo (microblog) reviews. The experimental results show that, compared with several other common models, the proposed model achieves a clear improvement in sentiment classification accuracy.
Authors: ZOU Wang (邹旺), ZHANG Wubo (张吴波) (School of Electrical and Information Engineering, Hubei University of Automotive Technology, Shiyan 442000)
Source: Computer & Digital Engineering (《计算机与数字工程》), 2023, No. 10, pp. 2351-2357 (7 pages)
Funding: Supported by the Hubei Province research project on an optimized intelligent management platform for production lines in an industrial Internet of Things environment (Grant No. TA02002).
Keywords: word embedding, sentiment analysis, word vector, feature extraction, weight
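
The abstract above describes the model architecture only in prose. The following is a minimal, illustrative PyTorch sketch of one way a BERT-BiGRU-Attention-CNN pipeline of this kind could be wired together. It is not the authors' implementation: the bert-base-chinese checkpoint, the layer sizes, the kernel widths, and the number of classes are all assumptions made for illustration.

```python
# Minimal, illustrative sketch (not the authors' code): BERT embeddings -> BiGRU ->
# token-level attention -> multi-kernel CNN -> classifier. All hyperparameters and
# the "bert-base-chinese" checkpoint are assumptions.
import torch
import torch.nn as nn
from transformers import BertModel


class BertBiGRUAttnCNN(nn.Module):
    def __init__(self, bert_name="bert-base-chinese", gru_hidden=128,
                 cnn_channels=100, kernel_sizes=(3, 4, 5), num_classes=2):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)           # dynamic contextual word vectors
        self.bigru = nn.GRU(self.bert.config.hidden_size, gru_hidden,
                            batch_first=True, bidirectional=True)  # long-range context in both directions
        self.attn = nn.Linear(2 * gru_hidden, 1)                   # one relevance score per token
        self.convs = nn.ModuleList(
            [nn.Conv1d(2 * gru_hidden, cnn_channels, k) for k in kernel_sizes]
        )                                                           # local n-gram feature extractors
        self.fc = nn.Linear(cnn_channels * len(kernel_sizes), num_classes)

    def forward(self, input_ids, attention_mask):
        h = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        g, _ = self.bigru(h)                                        # (B, T, 2*gru_hidden)
        scores = self.attn(g)                                       # (B, T, 1)
        scores = scores.masked_fill(attention_mask.unsqueeze(-1) == 0, float("-inf"))
        weights = torch.softmax(scores, dim=1)                      # attention weights over tokens
        g = (g * weights).transpose(1, 2)                           # (B, 2*gru_hidden, T) for Conv1d
        pooled = [torch.relu(conv(g)).max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(pooled, dim=1))                    # class logits
```

In a sketch like this, a batch of review sentences would be tokenized with BertTokenizer.from_pretrained("bert-base-chinese"), and the resulting input_ids and attention_mask passed to the module above; the logits would then be trained with a cross-entropy loss. The attention weights re-scale the BiGRU outputs before the convolutional layers, so the CNN extracts local n-gram features mainly from the tokens the model deems most relevant.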