Multi-state graph neural network for text classification
Abstract: To improve classification performance on text classification tasks, address the over-smoothing problem in graph neural networks, and strengthen the model's handling of text features and text representations, we propose a text classification algorithm based on a multi-state graph neural network (MSGNN). MSGNN reinforces the graph neural network with multiple historical states of its network layers and constructs well-formed text graph data as model input. While alleviating over-smoothing in the network layers, it combines two improved graph neural networks of different types to enhance the model's feature extraction and feature aggregation capabilities. A multi-head self-attention mechanism is used to mine and exploit text keywords, generating high-quality text representations from multiple text subspaces, from which the final classification is made. Experiments on several public text classification datasets show that the proposed algorithm achieves better classification accuracy than other neural-network-based text classification algorithms.
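The abstract does not detail how the text graph data is constructed; a common construction in graph-based text classification is a sliding-window word co-occurrence graph (as in TextGCN-style models). The sketch below is a minimal illustration under that assumption; the window size and count-based edge weighting are illustrative choices, not taken from the paper.

```python
from collections import defaultdict

def build_cooccurrence_graph(tokens, window=3):
    """Build an undirected word co-occurrence graph for one document.

    Nodes are the unique words; an edge (u, v) is weighted by the number
    of times u and v appear together within a sliding window of `window`
    tokens. Window size 3 is an illustrative default, not the paper's.
    """
    nodes = sorted(set(tokens))
    edges = defaultdict(int)
    for i, w in enumerate(tokens):
        # look ahead within the window and count each co-occurring pair
        for j in range(i + 1, min(i + window, len(tokens))):
            u, v = sorted((w, tokens[j]))
            if u != v:  # skip self-loops
                edges[(u, v)] += 1
    return nodes, dict(edges)
```

The resulting weighted adjacency can then be normalized and fed to the graph layers; document-level nodes or PMI-based edge weights are common variants of this construction.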
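The multi-head self-attention step described in the abstract can be sketched in NumPy: each head attends over the word-node embeddings in its own subspace, and the heads' outputs are concatenated, so the text representation draws on multiple subspaces. The projection matrices, head count, and mean pooling below are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(H, W_q, W_k, W_v, num_heads):
    """Scaled dot-product self-attention over n node embeddings H of shape (n, d).

    Each head operates in a d/num_heads-dimensional subspace; the heads'
    outputs are concatenated back to shape (n, d).
    """
    n, d = H.shape
    d_h = d // num_heads
    Q, K, V = H @ W_q, H @ W_k, H @ W_v
    heads = []
    for h in range(num_heads):
        s = slice(h * d_h, (h + 1) * d_h)
        scores = Q[:, s] @ K[:, s].T / np.sqrt(d_h)  # (n, n) attention logits
        heads.append(softmax(scores) @ V[:, s])      # weighted sum of values
    return np.concatenate(heads, axis=1)

def text_representation(H, W_q, W_k, W_v, num_heads=2):
    # mean-pool the attended node embeddings into one document vector;
    # the pooling choice is an assumption, not specified in the abstract
    return multi_head_self_attention(H, W_q, W_k, W_v, num_heads).mean(axis=0)
```

The pooled vector would then be passed to a classifier head (e.g. a linear layer with softmax) to produce the class prediction.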
Authors: WANG Jin; CHEN Chongyuan; DENG Xin; SUN Kaiwei (Key Laboratory of Data Engineering and Visual Computing, Chongqing University of Posts and Telecommunications, Chongqing 400065, P.R. China)
Source: Journal of Chongqing University of Posts and Telecommunications (Natural Science Edition), 2023, No. 2, pp. 193-201 (9 pages). Indexed in CSCD and the Peking University Core list.
Funding: National Key Research and Development Program of China (SQ2021YFE010559).
Keywords: natural language processing; text classification; graph neural network; attention mechanism