Attributed Network Representation Learning Based on Matrix Factorization (cited by: 1)
Abstract: To fuse network topological structure with node attribute information and thereby improve the quality of network representation learning, this paper proposes a new attributed network representation learning algorithm, ANEMF. The algorithm introduces cosine similarity to define a second-order structural similarity matrix and an attribute similarity matrix for the network. By jointly optimizing the structural-similarity and attribute-similarity loss functions, it fuses topological structure and node attributes within a matrix-factorization framework, and computes the node representation vectors with multiplicative update rules. Experimental results on three public datasets show that, compared with the DeepWalk and TADW algorithms, the node representations produced by ANEMF preserve both topological structure and node attribute information, and significantly improve overall performance on node classification tasks.
Authors: ZHANG Pan, LU Guangyue, LV Shaoqing, ZHAO Xueli (School of Communications and Information Engineering, Xi'an University of Posts and Telecommunications, Xi'an 710121, China; Shaanxi Provincial Key Laboratory of Information Communication Network and Security, Xi'an 710121, China)
Source: Computer Engineering (《计算机工程》; indexed in CAS, CSCD, and the Peking University Core list), 2020, No. 10, pp. 67-73 (7 pages)
Funding: Scientific Research Program of the Shaanxi Provincial Department of Education (17JK0703)
Keywords: machine learning; network analysis; data mining; network representation learning; matrix factorization; network embedding
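The abstract describes a pipeline of (1) building a second-order structural similarity matrix and an attribute similarity matrix via cosine similarity, (2) jointly factorizing both, and (3) solving with multiplicative update rules. The paper's exact objective and update rules are not reproduced here, so the following is a minimal sketch under an assumed standard joint non-negative factorization objective, min ||S − UVᵀ||² + λ||A − UWᵀ||²; the function name `anemf_sketch`, the weight `lam`, and the choice of factors U, V, W are illustrative, not the authors' formulation. It assumes non-negative adjacency and attribute matrices so that cosine similarities (and hence the multiplicative updates) stay non-negative.

```python
import numpy as np

def cosine_similarity(M):
    # Row-wise cosine similarity. With non-negative input rows the result
    # is non-negative, which multiplicative updates require.
    norms = np.linalg.norm(M, axis=1, keepdims=True)
    norms[norms == 0] = 1.0  # guard against all-zero rows
    Mn = M / norms
    return Mn @ Mn.T

def anemf_sketch(adj, attr, dim=16, lam=1.0, iters=200, eps=1e-9, seed=0):
    """Sketch of joint factorization: min ||S - U V^T||^2 + lam*||A - U W^T||^2.

    adj  : (n, n) non-negative adjacency matrix
    attr : (n, d) non-negative node attribute matrix
    Returns U, an (n, dim) matrix whose rows are node representation vectors
    shared between the structural and attribute factorizations.
    """
    # Second-order structural similarity: nodes whose neighborhoods
    # (adjacency rows) point in similar directions score highly.
    S = cosine_similarity(adj)
    # Attribute similarity between node attribute vectors.
    A = cosine_similarity(attr)

    n = adj.shape[0]
    rng = np.random.default_rng(seed)
    U = rng.random((n, dim))
    V = rng.random((n, dim))
    W = rng.random((n, dim))

    for _ in range(iters):
        # Multiplicative updates derived from the gradient of the joint
        # objective; eps avoids division by zero.
        U *= (S @ V + lam * (A @ W)) / (U @ (V.T @ V + lam * (W.T @ W)) + eps)
        V *= (S.T @ U) / (V @ (U.T @ U) + eps)
        W *= (A.T @ U) / (W @ (U.T @ U) + eps)
    return U
```

Because U appears in both reconstruction terms, it is the factor that fuses structure and attributes; downstream tasks such as node classification would use the rows of U as features.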