Journal articles — 1 result found
Distilling base-and-meta network with contrastive learning for few-shot semantic segmentation
Authors: Xinyue Chen, Yueyi Wang, Yingyue Xu, Miaojing Shi. 《Autonomous Intelligent Systems》 (EI-indexed), 2023, Issue 1, pp. 1-11 (11 pages).
Current studies in few-shot semantic segmentation mostly utilize meta-learning frameworks to obtain models that can generalize to new categories. However, models trained on base classes with sufficient annotated samples are biased towards those base classes, which causes semantic confusion and ambiguity between base classes and new classes. One strategy is to use an additional base learner to recognize objects of the base classes and then refine the predictions output by the meta learner. In this setting, the interaction between the two learners and the way their results are combined are important. This paper proposes a new model, the Distilling Base and Meta (DBAM) network, which uses a self-attention mechanism and contrastive learning to enhance few-shot segmentation performance. First, a self-attention-based ensemble module (SEM) is proposed to produce a more accurate adjustment factor for fusing the two learners' predictions. Second, a prototype feature optimization module (PFOM) is proposed to provide interaction between the two learners; it enhances the ability to distinguish the base classes from the target class by introducing a contrastive learning loss. Extensive experiments demonstrate that the method improves performance on PASCAL-5i under both 1-shot and 5-shot settings.
Keywords: Semantic segmentation; Few-shot learning; Meta learning; Contrastive learning; Self-attention
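The two ideas in the abstract — fusing the base and meta learners' predictions with an adjustment factor, and a contrastive loss that pulls query features toward the target-class prototype and away from base-class prototypes — can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names, the per-pixel fusion formula, and the InfoNCE-style loss are assumptions based only on the abstract's description.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_predictions(p_base, p_meta, alpha):
    """Blend the base learner's and meta learner's score maps with an
    adjustment factor alpha in [0, 1] (in the paper, produced by the
    self-attention-based ensemble module, SEM)."""
    return alpha * p_base + (1.0 - alpha) * p_meta

def contrastive_prototype_loss(query, pos_proto, neg_protos, tau=0.1):
    """InfoNCE-style loss: pull the query feature toward the target-class
    prototype (positive) and push it away from base-class prototypes
    (negatives), as the PFOM's contrastive objective is described."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)
    logits = np.array([cos(query, pos_proto)] +
                      [cos(query, n) for n in neg_protos]) / tau
    # Negative log-probability of the positive pair.
    return -np.log(softmax(logits)[0])
```

With `alpha = 0.3`, a pixel's fused score is 30% base prediction and 70% meta prediction; a query feature aligned with the target prototype yields a lower contrastive loss than one aligned with a base-class prototype.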