Journal Articles
3 articles found
1. A Deep Model for Partial Multi-label Image Classification with Curriculum-based Disambiguation
Authors: Feng Sun, Ming-Kun Xie, Sheng-Jun Huang. Machine Intelligence Research (EI, CSCD), 2024, No. 4, pp. 801-814 (14 pages)
In this paper, we study the partial multi-label (PML) image classification problem, where each image is annotated with a candidate label set consisting of multiple relevant labels and other noisy labels. Existing PML methods typically design a disambiguation strategy to filter out noisy labels by utilizing prior knowledge with extra assumptions, which unfortunately is unavailable in many real tasks. Furthermore, because the objective function for disambiguation is usually elaborately designed on the whole training set, it can hardly be optimized in a deep model with stochastic gradient descent (SGD) on mini-batches. In this paper, for the first time, we propose a deep model for PML to enhance the representation and discrimination ability. On the one hand, we propose a novel curriculum-based disambiguation strategy to progressively identify ground-truth labels by incorporating the varied difficulties of different classes. On the other hand, consistency regularization is introduced for model training to balance fitting identified easy labels and exploiting potential relevant labels. Extensive experimental results on the commonly used benchmark datasets show that the proposed method significantly outperforms the SOTA methods.
Keywords: partial multi-label image classification; curriculum-based disambiguation; consistency regularization; label difficulty; candidate label set
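The curriculum-based disambiguation idea in the abstract can be illustrated with a minimal sketch. The function below is hypothetical (not the authors' implementation): it accepts a candidate label as ground truth once its predicted probability clears a confidence threshold that relaxes over training, so easy, high-confidence labels are identified first.

```python
import numpy as np

def curriculum_disambiguate(probs, candidate_mask, epoch, total_epochs,
                            base_threshold=0.9):
    """Hypothetical sketch: progressively accept candidate labels whose
    predicted probability clears a threshold that relaxes over epochs."""
    # Threshold decays linearly from base_threshold toward 0.5.
    threshold = base_threshold - (base_threshold - 0.5) * (epoch / total_epochs)
    # Only labels inside the candidate set may be identified as ground truth.
    identified = (probs >= threshold) & candidate_mask.astype(bool)
    return identified

probs = np.array([[0.95, 0.40, 0.80, 0.10]])
candidates = np.array([[1, 1, 1, 0]])  # last label is not a candidate
early = curriculum_disambiguate(probs, candidates, epoch=0, total_epochs=10)
late = curriculum_disambiguate(probs, candidates, epoch=9, total_epochs=10)
```

Early in training only the 0.95 label is accepted; later, the relaxed threshold also admits the 0.80 label, while the non-candidate label stays excluded. The paper additionally varies the schedule per class by difficulty, which this sketch omits.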
2. Incremental Multi-Label Learning with Active Queries (Cited by: 3)
Authors: Sheng-Jun Huang, Guo-Xiang Li, Wen-Yu Huang, Shao-Yuan Li. Journal of Computer Science & Technology (SCIE, EI, CSCD), 2020, No. 2, pp. 234-246 (13 pages)
In multi-label learning, it is rather expensive to label instances since they are simultaneously associated with multiple labels. Therefore, active learning, which reduces the labeling cost by actively querying the labels of the most valuable data, becomes particularly important for multi-label learning. A good multi-label active learning algorithm usually consists of two crucial elements: a reasonable criterion to evaluate the gain of querying the label for an instance, and an effective classification model, based on whose prediction the criterion can be accurately computed. In this paper, we first introduce an effective multi-label classification model by combining label ranking with threshold learning, which is incrementally trained to avoid retraining from scratch after every query. Based on this model, we then propose to exploit both uncertainty and diversity in the instance space as well as the label space, and actively query the instance-label pairs which can improve the classification model most. Extensive experiments on 20 datasets demonstrate the superiority of the proposed approach to state-of-the-art methods.
Keywords: active learning; multi-label learning; uncertainty; diversity
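The abstract's key idea of querying individual instance-label pairs (rather than whole instances) can be sketched with a simple uncertainty criterion. This is an illustrative stand-in, not the paper's actual criterion, which also incorporates diversity: pairs whose predicted probability is closest to 0.5 are the most uncertain and are queried first.

```python
import numpy as np

def query_instance_label_pairs(probs, queried_mask, batch_size=2):
    """Hypothetical sketch: rank unqueried (instance, label) pairs by
    uncertainty (|p - 0.5| smallest first) and return the top pairs."""
    uncertainty = -np.abs(probs - 0.5)                 # higher = more uncertain
    uncertainty[queried_mask.astype(bool)] = -np.inf   # skip already-queried pairs
    flat = np.argsort(uncertainty, axis=None)[::-1][:batch_size]
    return [tuple(np.unravel_index(i, probs.shape)) for i in flat]

# Two instances, two labels; probabilities near 0.5 are most uncertain.
probs = np.array([[0.95, 0.52],
                  [0.48, 0.10]])
pairs = query_instance_label_pairs(probs, np.zeros_like(probs), batch_size=2)
```

Here the two pairs with probabilities 0.52 and 0.48 are selected, while the confident 0.95 and 0.10 predictions are left unqueried, saving annotation effort on pairs the model already handles well.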
3. Multi-label active learning by model guided distribution matching (Cited by: 4)
Authors: Nengneng Gao, Sheng-Jun Huang, Songcan Chen. Frontiers of Computer Science (SCIE, EI, CSCD), 2016, No. 5, pp. 845-855 (11 pages)
Multi-label learning is an effective framework for learning with objects that have multiple semantic labels, and has been successfully applied to many real-world tasks. In contrast with traditional single-label learning, the cost of labeling a multi-label example is rather high, thus it becomes an important task to train an effective multi-label learning model with as few labeled examples as possible. Active learning, which actively selects the most valuable data to query their labels, is the most important approach to reduce labeling cost. In this paper, we propose a novel approach MADM for batch mode multi-label active learning. On one hand, MADM exploits representativeness and diversity in both the feature and label space by matching the distribution between labeled and unlabeled data. On the other hand, it tends to query predicted positive instances, which are expected to be more informative than negative ones. Experiments on benchmark datasets demonstrate that the proposed approach can reduce the labeling cost significantly.
Keywords: multi-label learning; batch mode active learning; distribution matching
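The distribution-matching step described in this abstract can be sketched with a maximum mean discrepancy (MMD) criterion. This is a minimal illustrative sketch, not the MADM algorithm itself: it greedily picks unlabeled points so that the labeled pool plus the selected batch best matches the distribution of all available data.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel between rows of X and Y.
    d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def mmd(X, Y, gamma=1.0):
    """Squared maximum mean discrepancy between two samples."""
    return (rbf_kernel(X, X, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean()
            - 2 * rbf_kernel(X, Y, gamma).mean())

def greedy_batch(labeled, unlabeled, k):
    """Hypothetical sketch: greedily pick unlabeled points that, added to
    the labeled pool, best match the distribution of all data (min MMD)."""
    all_data = np.vstack([labeled, unlabeled])
    chosen, remaining = [], list(range(len(unlabeled)))
    for _ in range(k):
        best = min(remaining,
                   key=lambda i: mmd(np.vstack([labeled,
                                                unlabeled[chosen + [i]]]),
                                     all_data))
        chosen.append(best)
        remaining.remove(best)
    return chosen

labeled = np.array([[0.0, 0.0]])
unlabeled = np.array([[1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
batch = greedy_batch(labeled, unlabeled, k=2)
```

A batch selected this way is representative of the overall data distribution; MADM additionally biases selection toward predicted positive instances, which this sketch does not model.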