Abstract
To improve recommendation performance, a feature-normalized graph convolutional neural network (GCN) recommendation algorithm is proposed. It addresses the problem that, in existing GCN-based recommendation algorithms, the 2-3-layer propagation network structure is not conducive to information exchange between distant nodes, while deepening the network causes a sharp drop in performance. The method normalizes the output features of each propagation layer to keep node embeddings from becoming overly similar as the number of layers increases. In the prediction stage, a self-attention (SA) mechanism is used to connect the outputs of all layers and obtain a better final node representation. Comparisons with traditional algorithms and existing recommendation algorithms of the same type on three real-world datasets verify the effectiveness of the model. Experimental results show that, compared with the benchmark models, the proposed model clearly improves recall (Recall@N) and normalized discounted cumulative gain (NDCG@N), with an average gain of 1.675% and a maximum gain of 3.406%.
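The abstract describes two components: per-layer feature normalization during propagation (to keep deep-layer embeddings from becoming overly similar) and a self-attention fusion of all layer outputs at prediction time. The following is a minimal PyTorch sketch of this idea, assuming a LightGCN-style propagation without feature transformation; the class name, the choice of L2 normalization, the single-head attention, and the mean-pooled fusion are illustrative assumptions, not the paper's released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class NormalizedGCNPropagation(nn.Module):
    """Sketch: GCN propagation with per-layer feature normalization and
    self-attention fusion of layer outputs (hypothetical names/settings)."""

    def __init__(self, emb_dim: int, num_layers: int = 4):
        super().__init__()
        self.num_layers = num_layers
        # Single-head self-attention used to fuse the per-layer embeddings;
        # the paper only states that self-attention connects the layer outputs.
        self.attn = nn.MultiheadAttention(emb_dim, num_heads=1, batch_first=True)

    def forward(self, emb0: torch.Tensor, norm_adj: torch.Tensor) -> torch.Tensor:
        # emb0: (num_nodes, emb_dim) initial user/item embeddings
        # norm_adj: sparse, symmetrically normalized user-item adjacency matrix
        layer_outputs = [emb0]
        x = emb0
        for _ in range(self.num_layers):
            x = torch.sparse.mm(norm_adj, x)   # neighborhood aggregation
            x = F.normalize(x, p=2, dim=1)     # per-layer feature normalization (assumed L2)
            layer_outputs.append(x)
        # (num_nodes, num_layers + 1, emb_dim): treat each layer output as one "token"
        stacked = torch.stack(layer_outputs, dim=1)
        # Self-attention over the layer dimension, then mean-pool into the final embedding.
        fused, _ = self.attn(stacked, stacked, stacked)
        return fused.mean(dim=1)
```

Ranking scores would then be inner products between the fused user and item embeddings, as in standard GCN recommenders; running the attention over all nodes at once, as above, is only feasible for moderately sized graphs.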
Authors
赵东琛
车文刚
高盛祥
ZHAO Dongchen, CHE Wengang, GAO Shengxiang (Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, P.R. China)
Source
《重庆邮电大学学报(自然科学版)》
CSCD
PKU Core Journals (Beida Core)
2023, No. 3, pp. 528-535 (8 pages)
Journal of Chongqing University of Posts and Telecommunications (Natural Science Edition)
Funding
National Natural Science Foundation of China (61972186, U21B2027)
Yunnan High-Tech Industry Development Project (201606)
Yunnan Major Science and Technology Special Program (202103AA080015, 202002AD080001-5)
Yunnan Fundamental Research Program (202001AS070014)
Yunnan Reserve Talents Program for Academic and Technical Leaders (202105AC160018)
Keywords
recommendation algorithm
graph convolutional neural network
normalization layer
self-attention mechanism