融合超图卷积和自监督协同训练的组推荐算法

Group Recommendation Algorithm Incorporating Hypergraph Convolution and Self-supervised Collaborative Training
Abstract: With the popularity of social media, research attention has gradually shifted from individual recommendation to group recommendation. Most existing group recommendation models aggregate the individual preferences of group members into a group preference with heuristic or attention-based aggregation strategies. However, because user interaction data are sparse, the learned user features are incomplete; real-world user interactions are complex, and user relations may be high-order. Moreover, the similarity between groups and the individual preferences of members shared across groups are often ignored, although learning group similarity has great potential for improving group representation learning. To address these problems, this paper proposes a group recommendation algorithm that incorporates hypergraph convolution and self-supervised collaborative training (HCSC). First, in a user-level hypergraph, three channels encode high-order user relations in a hypergraph convolutional network, and the user features learned by the channels are aggregated into enhanced user representations, providing a solid basis for learning group preferences. Second, in a group-level hypergraph, all groups are connected into an overlapping network and attention is paid to the individual preferences of shared group members, so that the hyperedge embedding process can be regarded as learning group preferences. Third, to further enhance group representations, self-supervised learning is combined with co-training: two different graph encoders are built on the two hypergraphs and recursively generate labeled samples from different information, supervising each other through a contrastive learning strategy. Compared with discard (dropout) strategies, the proposed self-supervised co-training retains complete information and achieves genuine data augmentation. Extensive experiments on two real-world datasets demonstrate the superiority of the proposed HCSC model.
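This record does not include the paper's formulas, so the following is a minimal illustrative sketch of the two generic building blocks the abstract refers to: an HGNN-style hypergraph convolution over an incidence matrix, and an InfoNCE-style contrastive objective through which two encoders can supervise each other. The class and function names, dimensions, and the temperature value are assumptions for illustration, not the authors' exact three-channel design or loss weighting.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HypergraphConv(nn.Module):
    """Minimal HGNN-style hypergraph convolution (illustrative only;
    HCSC's actual three-channel user-level architecture is not given here)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.theta = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, X: torch.Tensor, H: torch.Tensor) -> torch.Tensor:
        # X: |V| x d node features, H: |V| x |E| incidence matrix
        Dv = H.sum(dim=1).clamp(min=1)   # vertex degrees
        De = H.sum(dim=0).clamp(min=1)   # hyperedge degrees
        # X' = relu(Dv^{-1/2} H De^{-1} H^T Dv^{-1/2} X Theta)
        A = torch.diag(Dv.pow(-0.5)) @ H @ torch.diag(De.pow(-1.0)) \
            @ H.t() @ torch.diag(Dv.pow(-0.5))
        return F.relu(A @ self.theta(X))


def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.2) -> torch.Tensor:
    """Contrastive objective letting two graph encoders supervise each other:
    the same node's embeddings from the two views are positives, all other
    nodes in the batch are negatives (tau = 0.2 is an assumed value)."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau                           # pairwise similarities
    labels = torch.arange(z1.size(0), device=z1.device)  # positives on the diagonal
    return F.cross_entropy(logits, labels)
```

In a co-training loop of the kind the abstract describes, the user-level and group-level encoders would each produce embeddings for the shared members, and a term such as info_nce would be added to the recommendation loss so that the two views generate supervision signals for each other.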
Authors: LIU Jingwen (刘静文), LIU Yuan (刘渊), YUAN Congqi (袁琮淇) (School of Artificial Intelligence and Computer Science, Jiangnan University, Wuxi, Jiangsu 214122, China; Jiangsu Key Laboratory of Media Design and Software Technology (Jiangnan University), Wuxi, Jiangsu 214122, China)
Source: Journal of Chinese Information Processing (《中文信息学报》, CSCD, Peking University Core Journal), 2024, No. 7, pp. 115-126, 136 (13 pages)
Funding: National Natural Science Foundation of China (61972182).
Keywords: group recommendation; hypergraph convolution; self-supervised learning; co-training; contrastive learning