
Sentiment Analysis Model of Students' Teaching Evaluation Text Based on Hybrid Feature Network
Abstract  Taking the sentiment analysis of students' teaching evaluation text as its task, and addressing the insufficient feature-extraction ability of traditional basic deep learning models, the low training efficiency of recurrent neural networks, and the inaccurate semantic representation of word vectors, a sentiment classification algorithm for teaching evaluation text based on a hybrid feature network is proposed. The lightweight pre-trained model ALBERT extracts a dynamic vector representation for each word that fits its current context, resolving the polysemy problem of traditional word vector models and improving the accuracy of the semantic representation. The hybrid feature network combines a simple recurrent unit, a multi-scale local convolution learning module, and a self-attention layer to comprehensively capture both the global contextual sequence features of the evaluation text and local semantic information at different scales, strengthening the model's deep feature representation. The self-attention mechanism computes the importance of each feature to the classification result, identifying the key features that most influence sentiment recognition and preventing irrelevant features from interfering with the result and degrading classification performance. The resulting feature vectors are concatenated, and a linear layer outputs the sentiment classification of the evaluation text. Experiments on a real student teaching evaluation text dataset show that the model achieves an F1 score of 97.8%, higher than comparison deep learning models such as BERT-BiLSTM and BERT-GRU-ATT. Ablation experiments further confirm the effectiveness of each module.
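The pipeline the abstract describes (contextual embeddings → global sequence features via self-attention, local features via multi-scale convolution, concatenation, linear classifier) can be sketched minimally in NumPy. This is an illustrative sketch, not the authors' implementation: the ALBERT embeddings and the BiSRU recurrence are replaced by random stand-in vectors, each convolution scale uses a single filter, and all weights and names (`conv_feature`, `W_out`, etc.) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Stand-in for ALBERT/BiSRU contextual token vectors (random for illustration).
seq_len, d = 10, 16
E = rng.normal(size=(seq_len, d))

# Multi-scale local convolution: one filter per window size {2, 3, 4},
# each slid over the token axis and max-pooled over time (TextCNN-style).
def conv_feature(X, W):
    k = W.shape[0]
    maps = [(X[i:i + k] * W).sum() for i in range(X.shape[0] - k + 1)]
    return max(maps)  # max-over-time pooling -> one scalar per filter

kernels = [rng.normal(size=(k, d)) * 0.1 for k in (2, 3, 4)]
local_feats = np.array([conv_feature(E, W) for W in kernels])  # shape (3,)

# Self-attention over the sequence: each row of A weights how much every
# other position contributes, so salient features dominate the pooled vector.
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
Q, K, V = E @ Wq, E @ Wk, E @ Wv
A = softmax(Q @ K.T / np.sqrt(d), axis=-1)  # each row sums to 1
global_feats = (A @ V).mean(axis=0)         # pooled attended representation

# Concatenate global and local features, classify with a linear layer.
z = np.concatenate([global_feats, local_feats])       # shape (d + 3,)
W_out = rng.normal(size=(z.shape[0], 3)) * 0.1        # 3 sentiment classes
probs = softmax(z @ W_out)                            # class distribution
```

In the paper's actual model the recurrent (SRU) branch, multiple filters per scale, and trained parameters would replace the random stand-ins above; the sketch only shows how the two feature streams are fused before the linear output layer.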
Authors  WU Qilin; DANG Yagu; XIONG Shanwei; JI Xu; BI Kexin (School of Chemical Engineering, Sichuan University, Chengdu 610041, China)
Source  Computer Engineering (《计算机工程》; indexed in CAS, CSCD, and the Peking University Core list), 2023, Issue 11, pp. 24-29, 39 (7 pages)
Funding  National Key Research and Development Program of China (2021YFB40005).
Keywords  sentiment analysis; pre-trained model; self-attention; bidirectional simple recurrent unit; multi-scale convolutional network