
An Improved Sentiment Analysis Model Incorporating Textual Topic Features (cited by: 3)
Abstract: [Objective] Massive user reviews are of great value to consumers and related enterprises. This paper addresses the data sparsity, unclear topics, and low classification accuracy caused by the short length of review texts. [Methods] We propose TSC-BiLSTM, a Bi-LSTM self-attention sentiment analysis model for online reviews that incorporates topic features. Unlike a traditional LSTM, the method uses the Latent Dirichlet Allocation (LDA) topic model to obtain the topic word distribution of each review, concatenates it with the review's word vectors as input, mines full-text feature information through a Bi-LSTM, and dynamically assigns weights with a self-attention mechanism. [Results] The model expands the feature space of the original short review text, reduces data sparsity, clarifies the topic, and improves the accuracy of sentiment classification. [Conclusions] Experiments on a hotel review dataset and a takeaway-platform review dataset show that the proposed method achieves better performance than related models and offers a new approach to topic-aware sentiment analysis.
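The fusion step the abstract describes (concatenating the review's LDA topic distribution with its word vectors as model input, then self-attention pooling over the recurrent hidden states) can be sketched roughly as follows. All dimensions, the random projection standing in for the Bi-LSTM, and every variable name here are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

np.random.seed(0)

# Hypothetical sizes (not from the paper): a 5-word review,
# 100-dim word vectors, a 10-topic LDA model.
T, d, k = 5, 100, 10

word_vecs = np.random.randn(T, d)             # word embeddings for the review
topic_dist = np.random.dirichlet(np.ones(k))  # LDA topic distribution theta

# Fusion: append the review-level topic distribution to each
# word vector, forming the (T, d + k) input sequence.
fused = np.hstack([word_vecs, np.tile(topic_dist, (T, 1))])

# Stand-in for Bi-LSTM hidden states: a fixed random projection.
# The real model would run a bidirectional LSTM over `fused`.
h_dim = 64
W = np.random.randn(d + k, 2 * h_dim) * 0.01
H = np.tanh(fused @ W)                        # (T, 2*h_dim)

# Self-attention pooling: score positions, softmax, weighted sum.
w_att = np.random.randn(2 * h_dim)
scores = H @ w_att
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()                          # attention weights, sum to 1
sent_vec = alpha @ H                          # final sentence representation

print(fused.shape, H.shape, sent_vec.shape)
```

The point of the sketch is the shape bookkeeping: the topic vector widens every timestep's input from `d` to `d + k`, which is how a short review's feature space gets expanded before the recurrent layer; the attention weights then decide how much each position contributes to the sentence vector fed to the classifier.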
Authors: ZHANG Shuai; HUANG Bo; JU Jiaji (School of Electrical and Electronic Engineering, Shanghai University of Engineering Science, Shanghai 201620, China)
Source: Frontiers of Data & Computing (《数据与计算发展前沿》), CSCD, 2022, No. 6, pp. 118-128 (11 pages)
Funding: National Key R&D Program of China (2020AAA0109300); Open Project of the Shanghai Key Laboratory of Integrated Administration Technologies for Information Security (AGK2019004)
Keywords: LDA; topic words; Bi-LSTM; self-attention; sentiment analysis
Related literature: references: 7 · secondary references: 31 · co-cited works: 139 · works co-cited with citing articles: 44 · citing articles: 3
