Abstract
Deep pre-trained models can be applied effectively to aspect-target sentiment analysis, but their complex structure and high computational cost prevent them from being applied directly to aspect-topic sentiment analysis. To address this, this paper proposes an improved shallow pre-trained model (JWT) that models the local context and the global context of the center word simultaneously. Local-context modeling follows the idea of word2vec, while global-context modeling uses the von Mises-Fisher (vMF) distribution. JWT treats the global context as a topic and uses it as the condition under which the center word's local context is generated, which makes the model suitable for aspect-topic sentiment analysis. The word similarity learned by JWT was evaluated on three datasets, and its sentiment classification performance on the SemEval ABSA review dataset was studied with four different sentiment classifiers. The results show that JWT outperforms the standard skip-gram on all experimental tasks and achieves results comparable to the existing baseline models cvMF and Joint skip-gram.
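Since the abstract describes JWT only at a high level, the following is a minimal, hypothetical Python sketch of the kind of objective it suggests: a skip-gram negative-sampling loss for the local context, plus a vMF-style term that ties the center-word vector to a unit-norm topic direction representing the global context. The class name JWTSketch, the fixed concentration kappa, and the dropped vMF normalizing constant are assumptions for illustration, not the authors' formulation.

```python
# Illustrative sketch only (not the paper's code): skip-gram with negative
# sampling for the local context, plus a vMF-style global-context term that
# pulls the center-word vector toward a topic direction on the unit sphere.
import torch
import torch.nn as nn
import torch.nn.functional as F

class JWTSketch(nn.Module):
    def __init__(self, vocab_size, n_topics, dim=100, kappa=10.0):
        super().__init__()
        self.in_emb = nn.Embedding(vocab_size, dim)    # center-word vectors
        self.out_emb = nn.Embedding(vocab_size, dim)   # context-word vectors
        self.topic_emb = nn.Embedding(n_topics, dim)   # global-context (topic) means
        self.kappa = kappa                             # assumed fixed vMF concentration

    def forward(self, center, context, negatives, topic):
        v = self.in_emb(center)          # (B, d) center words
        u_pos = self.out_emb(context)    # (B, d) observed context words
        u_neg = self.out_emb(negatives)  # (B, k, d) sampled negatives

        # Local context: standard skip-gram with negative sampling.
        pos = F.logsigmoid((v * u_pos).sum(-1))
        neg = F.logsigmoid(-(u_neg @ v.unsqueeze(-1)).squeeze(-1)).sum(-1)
        local_loss = -(pos + neg).mean()

        # Global context: vMF-style term, log p(v | topic) ∝ kappa * mu^T v,
        # with both vectors projected onto the unit sphere (normalizer dropped).
        mu = F.normalize(self.topic_emb(topic), dim=-1)
        v_dir = F.normalize(v, dim=-1)
        global_loss = -(self.kappa * (mu * v_dir).sum(-1)).mean()

        return local_loss + global_loss
```

In this reading, the topic index plays the role of the global context conditioning the center word, while the context and negative words are handled exactly as in word2vec; how the topic is inferred per document or sentence is left unspecified here.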
Author
FEI Honghui (费宏慧)
School of Electronics and Information Engineering, Shanghai Dianji University, Shanghai 201306, China
Source
Chinese Journal of Construction Machinery (《中国工程机械学报》), Peking University Core Journal (北大核心), 2021, No. 3, pp. 212-216 (5 pages)
Funding
National Natural Science Foundation of China (61702320)
Computer Science and Technology Advantageous Discipline Project of Shanghai Dianji University (16YSXK04).