
Ensemble Weak-Label Classification by Maximizing Dependency Between Label and Feature

Cited by: 3
Abstract: Weak-label learning is an important branch of multi-label learning that has been widely studied and applied to replenishing the missing labels of partially labeled instances and to classifying new instances. However, existing weak-label learning methods are generally vulnerable to the noisy and redundant features of high-dimensional data, in which multiple labels and missing labels are more likely to be present. To classify high-dimensional multi-label instances accurately, this paper proposes an ensemble weak-label classification method based on maximizing the dependency between labels and features (EnWL). EnWL first applies affinity propagation clustering repeatedly in the feature space of the high-dimensional data; the cluster centers selected in each run form a representative feature subset, reducing the impact of noisy and redundant features. EnWL then trains a semi-supervised multi-label classifier on each feature subset by maximizing the dependency between labels and features. Finally, it combines these base classifiers into an ensemble classifier via majority voting. Experimental results on several high-dimensional datasets show that EnWL outperforms related methods across various evaluation metrics.
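The pipeline summarized in the abstract — cluster features with affinity propagation, take exemplars as a feature subset, train a base learner per subset, then majority-vote — can be sketched as below. This is not the authors' implementation: the dependency-maximizing semi-supervised base learner is replaced by a plain one-vs-rest logistic regression as a stand-in, and diversity across runs is induced here by sub-sampling instances before clustering the features; both are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

def enwl_sketch(X, Y, n_subsets=3, rng=None):
    """X: (n_samples, n_features); Y: (n_samples, n_labels) binary indicator matrix."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    models = []
    for _ in range(n_subsets):
        # Sub-sample instances so each run sees the features differently
        # (assumption: the paper's repeated-clustering step is not specified here).
        sample = rng.choice(n, size=max(10, n // 2), replace=False)
        # Cluster the *features* (rows of X[sample].T); exemplars are
        # representative features.
        ap = AffinityPropagation(random_state=0).fit(X[sample].T)
        subset = ap.cluster_centers_indices_
        if subset is None or len(subset) == 0:  # AP did not converge
            subset = np.arange(X.shape[1])
        # Stand-in base learner; the paper uses a semi-supervised
        # dependency-maximizing multi-label classifier instead.
        clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))
        clf.fit(X[:, subset], Y)
        models.append((subset, clf))
    return models

def predict_vote(models, X):
    """Majority vote over the base classifiers' 0/1 label predictions."""
    votes = np.mean([clf.predict(X[:, s]) for s, clf in models], axis=0)
    return (votes >= 0.5).astype(int)
```

Each base classifier only ever sees its own representative feature subset, which is what shields the ensemble from noisy and redundant dimensions.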
Authors: TAN Qiao-Yu; YU Guo-Xian; WANG Jun; GUO Mao-Zu (College of Computer and Information Science, Southwest University, Chongqing 400715, China; School of Electrical and Information Engineering, Beijing University of Civil Engineering and Architecture, Beijing 100044, China)
Source: Journal of Software (《软件学报》; indexed in EI, CSCD, PKU Core), 2017, Issue 11, pp. 2851-2864 (14 pages)
Funding: National Natural Science Foundation of China (61402378, 61571163, 61532014, 61671189); Chongqing Basic and Frontier Research Project (cstc2014jcyjA40031, cstc2016jcyjA0351)
Keywords: weak-label learning; high-dimensional data; feature subset; dependency maximization; ensemble classification



相关作者

内容加载中请稍等...

相关机构

内容加载中请稍等...

相关主题

内容加载中请稍等...

浏览历史

内容加载中请稍等...
;
使用帮助 返回顶部