
Research of Text Categorization Method Combined with GCN and Attention Mechanism (Cited by: 1)
Abstract: To address the low classification accuracy caused by manual feature extraction in traditional text categorization methods, a method combining a graph convolutional network (GCN) with an attention mechanism is proposed. The method first builds a large text graph over the entire corpus, then feeds the graph's adjacency matrix and node feature matrix into the GCN. The network's output is then combined with a self-attention mechanism: attention values are computed from the Query, Key, and Value matrices, allowing the model to fully learn the text representation and continually refine the network output, ultimately improving classification accuracy. Simulation results on the datasets show that the proposed method achieves higher accuracy than traditional text categorization methods.
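The pipeline the abstract describes (text graph → normalized adjacency and feature matrices → GCN propagation → self-attention over the output) can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual model: the toy 4-node adjacency matrix, feature dimensions, and random weight matrices are all assumptions introduced here for demonstration.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetrically normalize A with self-loops: D^-1/2 (A + I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_layer(A_norm, X, W):
    """One GCN propagation step with ReLU: H = ReLU(A_norm X W)."""
    return np.maximum(A_norm @ X @ W, 0.0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H, Wq, Wk, Wv):
    """Scaled dot-product self-attention: softmax(QK^T / sqrt(d)) V."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores, axis=-1) @ V

# Toy text graph: 4 nodes (e.g. 2 documents, 2 words); edges link
# documents to the words they contain. Purely illustrative.
rng = np.random.default_rng(0)
A = np.array([[0, 0, 1, 1],
              [0, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
X = rng.standard_normal((4, 8))           # node feature matrix
W = rng.standard_normal((8, 4))           # GCN weight matrix
Wq, Wk, Wv = (rng.standard_normal((4, 4)) for _ in range(3))

H = gcn_layer(normalize_adjacency(A), X, W)
Z = self_attention(H, Wq, Wk, Wv)         # attention-refined node representations
print(Z.shape)
```

In a full model, `Z` for the document nodes would feed a softmax classifier, and the weight matrices would be learned end-to-end rather than sampled randomly.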
Authors: SHEN Yan-guang; JIA Yao-qing; SHENG Long; FAN Yong-jian (Department of Information and Electronic Engineering, Hebei University of Engineering, Handan, Hebei 056038, China; Hebei Key Laboratory of Security & Protection Information Sensing and Processing, Hebei University of Engineering, Handan, Hebei 056038, China)
Source: Computer Simulation (《计算机仿真》), Peking University Core Journal, 2021, No. 12, pp. 415-419 (5 pages)
Funding: National Key R&D Program of China (2018YFF0301004); National Natural Science Foundation of China (61802107); Key Research Project of Science and Technology for Higher Education Institutions of Hebei Province (ZD2018087); Natural Science Foundation of Hebei Province (F2018402251); Postgraduate Innovation Ability Training Program of the Hebei Education Department (CXZZSS2020086)
Keywords: Text categorization; GCN; Attention mechanism