
Multi-layer network embedding on SCC-based network with motif

Abstract: The interconnection of all things challenges traditional communication methods, and Semantic Communication and Computing (SCC) will become a new solution. Accurately detecting, extracting, and representing semantic information is a challenging task in the research of SCC-based networks. In previous research, convolution is usually used to extract the feature information of a graph and perform the corresponding node classification task. However, the content of semantic information is quite complex. Although graph convolutional neural networks provide an effective solution for node classification tasks, the extracted feature information suffers varying degrees of loss, because such networks are limited in representing multiple relational patterns and cannot recognize and analyze higher-order local structures. Therefore, this paper extends from a single-layer topology network to a multi-layer heterogeneous topology network. Word vectors trained with Bidirectional Encoder Representations from Transformers (BERT) are introduced to extract the semantic features in the network, and the existing graph neural network is improved by incorporating a module that captures the higher-order local features (motifs) of the network. A multi-layer network embedding algorithm on SCC-based networks with motifs is proposed to complete the task of end-to-end node classification. We verify the effectiveness of the algorithm on a real multi-layer heterogeneous network.
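The abstract describes improving a graph neural network with a module for higher-order local structures (motifs). A minimal sketch of that idea, assuming a triangle motif and a hypothetical mixing weight `alpha` (the paper's actual motif set, architecture, and combination rule are not specified here), is to build a motif-weighted adjacency matrix and mix it into a GCN-style propagation step:

```python
import numpy as np

def motif_adjacency(A):
    """Triangle-motif weighted adjacency: M[i, j] counts the triangles
    that edge (i, j) participates in, exposing higher-order local
    structure that plain edge adjacency misses."""
    A = np.asarray(A, dtype=float)
    return (A @ A) * A  # length-2 paths that close into a triangle

def motif_gcn_layer(A, X, W, alpha=0.5):
    """One GCN-style propagation step that mixes the edge adjacency
    with the motif adjacency (alpha is a hypothetical mixing weight),
    adds self-loops, symmetrically normalizes, and applies ReLU."""
    M = motif_adjacency(A)
    C = alpha * A + (1.0 - alpha) * M + np.eye(A.shape[0])
    d = C.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    C_hat = D_inv_sqrt @ C @ D_inv_sqrt
    return np.maximum(C_hat @ X @ W, 0.0)

# Toy 4-node graph: one triangle (0-1-2) plus a pendant node 3.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.eye(4)                # one-hot node features
W = np.ones((4, 2)) * 0.1    # tiny untrained weight matrix
H = motif_gcn_layer(A, X, W)
print(H.shape)  # (4, 2)
```

Note that the edge (2, 3) to the pendant node gets motif weight 0, so mixing in `M` down-weights edges outside dense local neighborhoods; this is the intuition behind motif-aware propagation.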
Source: Digital Communications and Networks (SCIE, CSCD), 2024, No. 3, pp. 546-556 (11 pages).
Funding: Supported by the National Natural Science Foundation of China (62101088, 61801076, 61971336); the Natural Science Foundation of Liaoning Province (2022-MS-157, 2023-MS-108); the Key Laboratory of Big Data Intelligent Computing Funds for Chongqing University of Posts and Telecommunications (BDIC-2023-A-003); and the Fundamental Research Funds for the Central Universities (3132022230).