Abstract
Aims: Graph neural networks, a recent deep learning technique, were used to build a multi-head attention graph pooling model with information retention for graph classification. Methods: First, an information retention module was designed to preserve the effective information carried by discarded nodes. Second, a multi-head attention mechanism computed the relevance between each central node and its neighbors several times, learning node importance scores more comprehensively. Then, isolated nodes in the pooled graph were connected to their neighbors to keep the graph structure connected. Finally, a readout operation passed the output of each layer to the classifier to complete graph classification. Results: The proposed method was evaluated on seven benchmark datasets and outperformed competing methods on five of them. Conclusions: The multi-head attention graph pooling model with information retention is effective for graph classification tasks.
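The pipeline described above can be illustrated with a minimal NumPy sketch: multi-head attention scores how much relevance each node receives from its neighbors, the top-k nodes are retained, and (the information retention step) the features of dropped nodes are folded into their retained neighbors instead of being discarded. This is an illustrative toy under assumed sizes and random projections, not the paper's implementation; the averaging rule for retained information is a hypothetical choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: N nodes, F-dimensional features, H attention heads (assumed sizes).
N, F, H = 6, 4, 2
X = rng.normal(size=(N, F))
A = np.array([[0, 1, 1, 0, 0, 0],   # symmetric adjacency, no self-loops
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 0],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 0, 1, 0]], dtype=float)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Multi-head attention node scoring: each head computes the relevance of a
# central node to its neighbors; a node's importance is the attention mass
# it receives, averaged over heads.
scores = np.zeros(N)
for h in range(H):
    Wq = rng.normal(size=(F, F)) / np.sqrt(F)   # hypothetical per-head projections
    Wk = rng.normal(size=(F, F)) / np.sqrt(F)
    Q, K = X @ Wq, X @ Wk
    att = (Q @ K.T) / np.sqrt(F)                # pairwise relevance
    att = np.where(A > 0, att, -np.inf)         # restrict attention to graph edges
    att = softmax(att, axis=1)
    scores += att.sum(axis=0) / H               # attention received, head-averaged

# Top-k pooling with information retention.
k = 3
keep = np.argsort(-scores)[:k]                  # retained nodes
drop = np.setdiff1d(np.arange(N), keep)         # discarded nodes

X_pool = X[keep].copy()
for d in drop:
    # Fold the dropped node's features into its retained neighbors
    # rather than throwing them away.
    nbrs = [i for i, j in enumerate(keep) if A[d, j] > 0]
    if nbrs:
        for i in nbrs:
            X_pool[i] += X[d] / len(nbrs)

print("retained nodes:", keep, "pooled feature shape:", X_pool.shape)
```

A full model would additionally reconnect isolated nodes in the pooled graph and apply a readout over each layer's output before classification, as the abstract describes.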
Authors
GU Xin; YE Hailiang; YANG Bing; CAO Feilong (College of Sciences, China Jiliang University, Hangzhou 310018, China)
Source
Journal of China University of Metrology
2022, No. 2, pp. 288-296 (9 pages)
Funding
National Natural Science Foundation of China (No. 62032022).
Keywords
graph neural network
pooling
information retention
multi-head attention