Research on few-shot text classification techniques based on text-level graph neural networks
Abstract: To address the poor few-shot text classification accuracy of text graph neural networks, a text-level graph neural network with a prototypical network (LGNN-Proto) was designed. An advanced pre-trained language model was adopted, and a text-level graph neural network built a graph for each input text while sharing global parameters. The output of the text graph neural network was then fed into the prototypical network to classify unlabeled texts, and the new model was validated on multiple text classification datasets. Experimental results show that, compared with supervised learning methods that require large numbers of labeled documents, the proposed method improves unlabeled-text classification accuracy by 1% to 3%, while achieving advanced performance with lower memory consumption across multiple datasets. The findings provide a reference for solving few-shot text classification problems.
Authors: AN Xiangcheng, LIU Baozhu, GAN Jingwei (The 54th Research Institute of CETC, Shijiazhuang, Hebei 050051, China)
Source: Journal of Hebei University of Science and Technology (《河北科技大学学报》; indexed in CAS and the PKU Core Journals list), 2024, No. 1, pp. 52-58 (7 pages).
Funding: Development Fund of the Hebei Key Laboratory of Intelligent Information Perception and Processing (SXX22138X002); LZH joint QB data fusion and sharing service project.
Keywords: natural language processing; few-shot text classification; pre-trained model; graph neural network; prototype network
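The prototypical-network step described in the abstract (class prototypes computed from support-set embeddings, with queries assigned to the nearest prototype) can be sketched as follows. The toy vectors here are hypothetical stand-ins for the text-level GNN outputs; this is an illustrative sketch, not the paper's code:

```python
def mean_vector(vectors):
    """Element-wise mean of a list of equal-length vectors (the class prototype)."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def squared_distance(a, b):
    """Squared Euclidean distance between two vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(query, support):
    """support maps each class label to its few-shot support embeddings.
    Returns the label whose prototype is nearest to the query embedding."""
    prototypes = {label: mean_vector(embs) for label, embs in support.items()}
    return min(prototypes, key=lambda label: squared_distance(query, prototypes[label]))

# Toy 2-way 2-shot episode with 2-d "embeddings" standing in for GNN outputs.
support = {
    "sports":   [[0.9, 0.1], [0.8, 0.2]],
    "politics": [[0.1, 0.9], [0.2, 0.8]],
}
print(classify([0.7, 0.3], support))  # → sports
```

In the paper's pipeline, the embeddings would come from the text-level GNN run over each document's word graph; only the nearest-prototype decision rule is shown here.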