
Improved Attention Graph Convolutional Network Model for Inexact Graph Matching (cited by: 5)
Abstract: When the traditional graph convolutional network model is applied to inexact graph matching, node features and the topological features between nodes are prone to loss in the early stages of the convolution step, which degrades matching performance. To address this problem, an improved attention graph convolutional network model is proposed. It learns hierarchical representations end-to-end with relatively few parameters and uses a self-attention mechanism to distinguish nodes that should be discarded from those that should be retained. First, an attention graph convolutional network automatically learns the importance of neighborhoods at different hops. Second, a self-attention pooling layer is added to summarize the graph representation from all aspects of the matrix graph embedding. Finally, the model is trained and tested on multiple standard graph datasets. Experimental results show that the method achieves better graph classification performance on standard graph datasets than state-of-the-art graph kernels and other deep learning algorithms.
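The self-attention pooling step summarized in the abstract can be sketched as follows. This is a minimal NumPy illustration in the spirit of self-attention graph pooling (score each node with a one-channel graph convolution, keep the top-scoring fraction, and gate the retained features); the function names `gcn_layer` and `self_attention_pool`, the weight shapes, and the pooling ratio are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph-convolution step: D^{-1/2} (A + I) D^{-1/2} X W."""
    A_hat = A + np.eye(A.shape[0])                      # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return A_norm @ X @ W

def self_attention_pool(A, X, w_score, ratio=0.5):
    """Score nodes via a 1-channel graph convolution; keep the top-k and gate them."""
    z = np.tanh(gcn_layer(A, X, w_score)).ravel()       # one attention score per node
    k = max(1, int(round(ratio * A.shape[0])))
    idx = np.sort(np.argsort(-z)[:k])                   # indices of retained nodes
    A_pooled = A[np.ix_(idx, idx)]                      # induced subgraph adjacency
    X_pooled = X[idx] * z[idx, None]                    # attention-gated node features
    return A_pooled, X_pooled, idx

# Toy 4-node path graph with 3-dimensional node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 3))
A_p, X_p, idx = self_attention_pool(A, X, w_score=np.ones((3, 1)), ratio=0.5)
```

With `ratio=0.5`, half of the nodes survive pooling, so `A_p` is 2x2 and `X_p` is 2x3; stacking several such pooling layers yields the hierarchical, coarsened graph representation the abstract describes.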
Authors: LI Chang-hua; LIU Yi; LI Zhi-jie (College of Information and Control Engineering, Xi'an University of Architecture and Technology, Xi'an 710055, China)
Source: Journal of Chinese Computer Systems (《小型微型计算机系统》, CSCD, Peking University Core), 2021, No. 1, pp. 41-45 (5 pages)
Funding: National Natural Science Foundation of China (61373112, 51878536); Natural Science Foundation of Shaanxi Province (2020JQ-687)
Keywords: node neighborhood; graph topology; graph matching; attention graph convolutional network; self-attention graph pooling