
Triplet Cross-Modal Retrieval Based on TopN Pairwise Similarity Transfer
Abstract: With the rapid development of science and technology, information on the Internet increasingly coexists in multiple modalities, and how to store and retrieve such multi-modal information has become a research hotspot. Cross-modal retrieval uses data of one modality to retrieve semantically related data of other modalities. Most existing research focuses on bringing related samples as close as possible, and pushing unrelated samples as far apart as possible, in a common subspace, while largely ignoring the ranking among the related samples. This paper therefore proposes a triplet cross-modal retrieval method based on TopN pairwise similarity transfer. The method uses a triplet loss together with Locality Preserving Projections to construct a common subspace shared by multiple modalities, and simultaneously transfers the high-similarity relations between samples in the original space into the common subspace, so as to impose reasonable ranking constraints. Experiments on two classical cross-modal datasets demonstrate the effectiveness of the method.
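The two ingredients named in the abstract, a triplet loss and a TopN pairwise similarity-transfer constraint, can be sketched as follows. This is a minimal illustration rather than the paper's implementation: the function names, the use of cosine similarity, and the squared-error form of the transfer penalty are assumptions, since the abstract does not give exact formulas.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Standard triplet loss: pull the positive toward the anchor and
    push the negative at least `margin` farther away (hinge form)."""
    d_pos = np.sum((anchor - positive) ** 2)
    d_neg = np.sum((anchor - negative) ** 2)
    return max(0.0, margin + d_pos - d_neg)

def cosine_sim(X):
    """Pairwise cosine-similarity matrix for the row vectors of X."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return Xn @ Xn.T

def topn_transfer_loss(S_orig, S_common, n=3):
    """Hypothetical TopN similarity-transfer penalty: for each sample,
    keep only its n most similar neighbours in the original space and
    penalise how much those similarities change in the common subspace."""
    loss = 0.0
    for i in range(S_orig.shape[0]):
        # indices of the n highest original similarities, excluding self
        order = np.argsort(-S_orig[i])
        top = [j for j in order if j != i][:n]
        for j in top:
            loss += (S_orig[i, j] - S_common[i, j]) ** 2
    return loss / S_orig.shape[0]
```

In a full training loop the two terms would be weighted and summed, with `S_orig` computed once from the raw features and `S_common` recomputed from the learned projections at each step; the weighting scheme here is left open, as the abstract does not specify it.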
Source: Computer Science and Application (《计算机科学与应用》), 2021, No. 10, pp. 2529-2537 (9 pages).