
Text Sentiment Analysis Based on Correlated Topic Model and Multi-layer Knowledge Representation (cited by: 3)
Abstract  The correlated topic model and a multi-layer knowledge representation method were combined to carry out text sentiment analysis. First, to address the shortcomings of traditional segmentation algorithms and to exploit correlations among topics, the correlated topic model was used to segment the text by topic features, and the resulting topic prior information was built as input to a pre-trained language model. Second, based on the topic prior information and the topic correlation vectors, embeddings from the pre-trained language model were used to represent the words of a text dynamically, which effectively resolved the polysemy problem. Finally, a bidirectional long short-term memory (BiLSTM) model was adopted to represent the sentences of a text, considering the context before and after each word to capture positional information within the sentence; an attention mechanism was integrated into the extraction of information from the sentence representation vectors, using multi-head extraction with a global view to obtain more comprehensive textual information.
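The sentence-encoding stage of the abstract (BiLSTM over pretrained word embeddings, followed by multi-head attention pooling and a classifier head) can be sketched as below. This is an illustrative reconstruction, not the authors' implementation: all hyperparameters are placeholders, and a plain embedding lookup stands in for the paper's topic-conditioned pretrained language-model embeddings.

```python
import torch
import torch.nn as nn

class BiLSTMAttnClassifier(nn.Module):
    """Sketch of the abstract's sentence encoder: BiLSTM contextualizes
    each word with its left and right context, multi-head self-attention
    pools global information, and a linear head outputs sentiment logits."""

    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=64,
                 num_heads=4, num_classes=2):
        super().__init__()
        # Placeholder for the paper's pretrained, topic-conditioned embeddings.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM: captures information before and after each word.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Multi-head self-attention over all time steps (global extraction).
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads,
                                          batch_first=True)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq_len, embed_dim)
        h, _ = self.bilstm(x)       # (batch, seq_len, 2 * hidden_dim)
        a, _ = self.attn(h, h, h)   # self-attention across the sentence
        pooled = a.mean(dim=1)      # average-pool the attended states
        return self.fc(pooled)      # (batch, num_classes) logits

model = BiLSTMAttnClassifier()
logits = model(torch.randint(0, 1000, (3, 12)))  # batch of 3 sentences
```

In this sketch the attended states are mean-pooled into a single sentence vector; the paper's multi-layer representation and topic priors would replace the embedding layer and pooling choices.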
Authors  MA Changlin (马长林); WANG Tao (王涛) — Hubei Provincial Key Laboratory of Artificial Intelligence and Smart Learning, Central China Normal University, Wuhan 430079, China; School of Computer, Central China Normal University, Wuhan 430079, China; National Language Resources Monitoring & Research Center for Network Media, Wuhan 430079, China
Source  Journal of Zhengzhou University (Natural Science Edition), Peking University Core Journal, 2021, No. 4, pp. 30-35 (6 pages)
Funding  National Natural Science Foundation of China (61772224)
Keywords  correlated topic model; multi-layer knowledge representation; deep learning; text segmentation; attention mechanism
