Abstract
In recent years, teacher-student knowledge distillation frameworks have achieved encouraging performance on graph neural networks. However, such frameworks still suffer from two problems: the knowledge captured by the teacher model is not comprehensive enough to guide the student model well, and the student model itself has limited learning ability. To address these two issues, this paper proposes a node classification method based on dual-channel knowledge distillation. Specifically, the method introduces dual teacher models that learn from two perspectives, topological structure and feature attributes, which ensures the diversity and comprehensiveness of the teachers' knowledge. The student model employs two prediction mechanisms, parameterized label propagation and neighbor feature aggregation, to ensure a stronger learning capability. Finally, the dual teachers guide the student model from the topological and feature perspectives, respectively. Experimental results on five real-world datasets show that the proposed model achieves better classification performance than the best baseline model.
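This record contains no code, so the sketch below is only a rough illustration of the two mechanisms the abstract names: a student that mixes parameterized label propagation with neighbor feature aggregation, and a distillation loss through which two teachers (one topology-based, one feature-based) guide that student. All class and function names, the learned mixing weight, and the KL-based objective are assumptions for illustration, not the authors' exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualChannelStudent(nn.Module):
    """Hypothetical student combining two prediction mechanisms:
    neighbor feature aggregation and parameterized label propagation."""

    def __init__(self, in_dim: int, num_classes: int):
        super().__init__()
        self.classifier = nn.Linear(in_dim, num_classes)  # feature channel
        self.balance = nn.Parameter(torch.tensor(0.0))    # learned mixing weight

    def forward(self, x, adj_norm, soft_labels):
        # x:           [N, in_dim]  node features
        # adj_norm:    [N, N]       row-normalized adjacency (with self-loops)
        # soft_labels: [N, C]       known or soft label distributions per node
        # Channel 1: aggregate neighbor features, then classify.
        feat_probs = F.softmax(self.classifier(adj_norm @ x), dim=-1)
        # Channel 2: propagate label distributions over the graph.
        prop_probs = adj_norm @ soft_labels
        a = torch.sigmoid(self.balance)  # parameterizes the channel trade-off
        return a * feat_probs + (1.0 - a) * prop_probs

def dual_teacher_kd_loss(student_probs, topo_logits, feat_logits, T=2.0):
    """KL distillation from a topology teacher and a feature teacher
    (a common KD objective; the paper's exact loss may differ)."""
    log_p = torch.log(student_probs.clamp_min(1e-9))
    kd_topo = F.kl_div(log_p, F.softmax(topo_logits / T, dim=-1),
                       reduction="batchmean")
    kd_feat = F.kl_div(log_p, F.softmax(feat_logits / T, dim=-1),
                       reduction="batchmean")
    return kd_topo + kd_feat
```

In this reading, each teacher supervises the student from its own view of the graph, while the sigmoid-gated weight lets the student learn how much to trust propagated labels versus aggregated features.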
Authors
WANG Xin-sheng, ZHU Xiao-fei, HUANG Xian-ying (College of Computer Science and Engineering, Chongqing University of Technology, Chongqing 400054, China)
Source
Journal of Chinese Computer Systems (《小型微型计算机系统》)
Indexed in CSCD and the Peking University Core Journal list (北大核心)
2023, No. 10, pp. 2284-2290 (7 pages)
Funding
Supported by the National Natural Science Foundation of China (62141201)
Supported by the Natural Science Foundation of Chongqing, General Program (CSTB2022NSCQ-MSX1672)
Supported by the Major Project of the Science and Technology Research Program of the Chongqing Municipal Education Commission (KJZD-M202201102)
Keywords
knowledge distillation
graph neural network
node classification
label propagation
attention mechanism