Abstract
To address the problems that graph neural network models tend to lose a large amount of feature information and preserve the graph topology incompletely when learning graph-embedded node representations, an improved graph attention mechanism model is proposed. The model consists of two parts: a node-level bidirectional attention mechanism and graph-level self-attention graph pooling. When learning new feature-vector representations of graph nodes, bidirectional graph attention weights are computed, which provides a reliable basis for selecting and retaining neighborhood nodes while strengthening the similar properties between nodes. On the overall topology of the graph, self-attention graph pooling is applied: node feature vectors are used as input, and the self-attention weights produced by the attention convolution layer are used in the pooling layer to generate the graph embedding representation. The model is evaluated on the Cora, Citeseer, and Pubmed datasets. The experimental results show that, compared with the baseline graph attention mechanism model, the improved model fully accounts for the local and global structural features of the graph, effectively strengthens the model's ability to aggregate neighborhood information, reduces the loss of original features during graph embedding, and clearly improves the model's performance on downstream tasks.
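To make the two-stage design described in the abstract concrete, the following is a minimal sketch of a node-level bidirectional attention stage combined with graph-level self-attention pooling, assuming a PyTorch Geometric-style implementation; the class name, layer choices, fusion rule, and hyperparameters are illustrative assumptions and not the authors' code.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv, SAGPooling, global_mean_pool

class BiGATSAGPool(torch.nn.Module):
    """Illustrative sketch: bidirectional node-level attention followed by
    self-attention graph pooling (names and settings are hypothetical)."""
    def __init__(self, in_dim, hid_dim, num_classes, heads=8, ratio=0.5):
        super().__init__()
        # Two attention layers: one over the original edge directions and one
        # over the reversed edges, so each node attends to both in- and
        # out-neighbours.
        self.att_fwd = GATConv(in_dim, hid_dim, heads=heads, concat=False)
        self.att_bwd = GATConv(in_dim, hid_dim, heads=heads, concat=False)
        # Self-attention graph pooling keeps the top-scoring nodes.
        self.pool = SAGPooling(hid_dim, ratio=ratio)
        self.classifier = torch.nn.Linear(hid_dim, num_classes)

    def forward(self, x, edge_index, batch):
        # Bidirectional node-level attention: aggregate along the original
        # edges and along the reversed edges, then fuse the two views.
        rev_edge_index = edge_index.flip(0)
        h = F.elu(self.att_fwd(x, edge_index) + self.att_bwd(x, rev_edge_index))
        # Graph-level self-attention pooling coarsens the graph; a mean
        # readout then yields a fixed-size graph embedding.
        h, edge_index, _, batch, _, _ = self.pool(h, edge_index, batch=batch)
        g = global_mean_pool(h, batch)
        return self.classifier(g)
```

Summing the forward- and reverse-direction attention passes is only one simple way to realize "bidirectional" aggregation; the paper's exact weight computation and fusion rule may differ.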
Authors
LI Zhijie; HAN Jinjin; LI Changhua; ZHANG Jie (College of Information and Control Engineering, Xi'an University of Architectural Science and Technology, Xi'an 710055, China)
Source
Computer Engineering and Applications (《计算机工程与应用》)
CSCD
Peking University Core Journal (北大核心)
2023, No. 17, pp. 152-158 (7 pages)
Funding
National Natural Science Foundation of China (61373112, 51878536)
Natural Science Foundation of Shaanxi Province (2020JQ-687)
Science and Technology Program of Housing and Urban-Rural Development of Shaanxi Province (2020-K09)
Keywords
graph embedding
bidirectional graph attention mechanism
self-attention graph pooling
feature representation
graph topology