Abstract
To address the low accuracy of current sentiment classification methods, which under-utilize textual information and ignore user preferences, this paper proposes an aspect-level sentiment classification method that fuses multiple textual information with an attention mechanism: attention is introduced to handle the multiple texts, and an SRNN model is used to fully extract the hidden features of the text. Taking e-commerce platforms as the research object, the method jointly exploits product-introduction text and user-review text. First, an attention mechanism lets the two kinds of text interact, yielding a representation vector that fuses the multiple texts. This representation is then processed in both the forward and backward directions to fully extract the hidden features of the text. Finally, a dedicated aspect-processing module is trained for each aspect mentioned in the reviews; the aspect the user is most interested in is determined from user preferences, the feature vector is fed into the corresponding aspect-processing module, and aspect-level sentiment polarity is computed to obtain the final classification result. Comparative experiments on the Douban dataset show that the proposed method clearly outperforms current mainstream LSTM- and CNN-based methods in both accuracy and F1 score.
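The pipeline described in the abstract (cross-attention fusion of the two texts, bidirectional recurrent feature extraction, then a per-aspect polarity head) can be sketched roughly as follows. This is a minimal illustrative NumPy sketch, not the authors' implementation: the SRNN is approximated by a plain vanilla RNN, all weights are random, and the aspect names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_fuse(review, intro):
    """Each review token attends over the product-intro tokens;
    the attended intro context is concatenated to the token."""
    weights = softmax(review @ intro.T, axis=-1)   # (n_rev, n_intro)
    context = weights @ intro                      # (n_rev, d)
    return np.concatenate([review, context], axis=-1)  # (n_rev, 2d)

def simple_rnn(seq, W, U):
    """One-direction tanh RNN pass over a sequence of vectors."""
    h, outs = np.zeros(W.shape[1]), []
    for x in seq:
        h = np.tanh(x @ W + h @ U)
        outs.append(h)
    return np.stack(outs)

def bidirectional_features(seq, W, U):
    """Process forward and backward; concatenate the two final states."""
    fwd = simple_rnn(seq, W, U)
    bwd = simple_rnn(seq[::-1], W, U)[::-1]
    return np.concatenate([fwd[-1], bwd[0]])

rng = np.random.default_rng(0)
d, h = 8, 6
review = rng.normal(size=(5, d))   # toy user-review embeddings
intro = rng.normal(size=(3, d))    # toy product-introduction embeddings

fused = attention_fuse(review, intro)              # (5, 2d)
W = rng.normal(size=(2 * d, h)) * 0.1
U = rng.normal(size=(h, h)) * 0.1
feat = bidirectional_features(fused, W, U)         # (2h,)

# One linear head per aspect; "price"/"quality" are hypothetical aspect names.
aspect_heads = {a: rng.normal(size=(2 * h, 3)) * 0.1 for a in ("price", "quality")}
preferred = "price"                                # aspect chosen from user preference
logits = feat @ aspect_heads[preferred]
polarity = ("negative", "neutral", "positive")[int(np.argmax(logits))]
```

In a trained model the weights would of course be learned and the aspect modules trained separately on reviews about their aspect, as the abstract describes; the sketch only shows how the three stages compose.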
Authors
FENG Yong; XU Jianhang; WANG Rongbing; XU Hongyan (College of Information, Liaoning University, Shenyang 110036)
Source
Computer & Digital Engineering (《计算机与数字工程》)
2024, No. 3, pp. 903-908 (6 pages)
Funding
Supported by the Liaoning Provincial Social Science Planning Fund Project (No. L21BGL026).
Keywords
sentiment classification
aspect-level
multiple textual information
attention mechanism
SRNN