
Multi-label semi-supervised learning method learnt from Hilbert-Schmidt independence criterion (Cited by: 4)
Abstract: The Hilbert-Schmidt independence criterion (HSIC) measures the degree of dependence between the feature set and the label set of a sample collection. Building on HSIC, this paper presents a new semi-supervised learning method, the dependence maximization multi-label semi-supervised learning method (DMMS). Treating the existing labels as constraints and the dependence between features and labels as the optimization objective, the method solves a linear system to obtain labels for the unlabeled samples; it is simple to implement and requires no parameter estimation. Experiments on several real multi-label datasets show that DMMS performs on par with state-of-the-art multi-label learning methods, including multi-label k-nearest neighbor (MLKNN) and graph-based semi-supervised learning.
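The criterion at the core of the method is standard: the empirical HSIC of a feature kernel K and a label kernel L is (n-1)^{-2} tr(KHLH), where H is the centering matrix. A minimal NumPy sketch (the toy data and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def hsic(K, L):
    """Empirical HSIC: (n-1)^{-2} * tr(K H L H), with the centering
    matrix H = I - (1/n) 1 1^T; larger values mean stronger dependence."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                  # features
Y = (X @ rng.normal(size=(5, 3)) > 0) * 1.0    # multi-labels that depend on X
Y_perm = Y[rng.permutation(100)]               # same labels, dependence destroyed

K = X @ X.T                                    # linear kernel on features
dep = hsic(K, Y @ Y.T)                         # feature/label kernels: dependent pair
indep = hsic(K, Y_perm @ Y_perm.T)             # permuted labels: near-independent pair
```

With linear kernels, HSIC reduces (up to scaling) to the squared Frobenius norm of the empirical cross-covariance between features and labels; since that objective is quadratic in the label matrix, fixing the labeled rows and setting the gradient over the unlabeled rows to zero yields a linear system, consistent with the abstract's description.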
Source: China Sciencepaper (《中国科技论文》), CAS indexed, PKU Core, 2013, Issue 10: 998-1002 (5 pages)
Funding: Higher Education Scientific Research Project of the Hainan Provincial Department of Education (Hjkj2012-01); National Natural Science Foundation of China (11261015)
Keywords: Hilbert-Schmidt independence criterion; multi-label learning; semi-supervised learning
References (16)

  • 1 Zhou Dengyong, Bousquet O, Lal T N, et al. Learning with local and global consistency [C]//Auer P, Meir P. 18th Annual Conf on Neural Information Processing Systems. Cambridge: MIT Press, 2004: 321-328.
  • 2 Wang Fei, Wang Jingdong, Zhang Changshui, et al. Semi-supervised classification using linear neighborhood propagation [C]//IEEE Conference on Computer Vision and Pattern Recognition. New York City: IEEE, 2006: 160-167.
  • 3 Chapelle O, Scholkopf B, Zien A. Semi-Supervised Learning [M]. Cambridge: MIT Press, 2006: 333-341.
  • 4 Read J, Pfahringer B, Holmes G, et al. Classifier chains for multi-label classification [J]. Mach Learn, 2011, 85(3): 333-359.
  • 5 Tsoumakas G, Vlahavas I. Random k-labelsets for multi-label classification [J]. IEEE Trans Knowl Data Eng, 2011, 23(7): 1079-1089.
  • 6 Zhang Minling, Zhou Zhihua. A k-nearest neighbor based algorithm for multi-label classification [C]//Andrzej Skowron. 1st IEEE International Conference on Granular Computing. NY: IEEE, 2005: 718-721.
  • 7 Zha Zhengjun, Mei Tao, Wang Jingdong, et al. Graph-based semi-supervised learning with multiple labels [J]. J Visual Commun Image Represent, 2009, 20(2): 97-103.
  • 8 Chiang T H, Lo Hung Yi, Lin Shoude. A ranking-based KNN approach for multi-label classification [J]. J Mach Learn Res: Proc Track, 2012, 25: 81-96.
  • 9 Hardoon D R, Szedmak S, Taylor J S. Canonical correlation analysis: an overview with application to learning methods [J]. Neural Comput, 2004, 16(12): 2639-2664.
  • 10 Gretton A, Smola A, Bousquet O, et al. Kernel constrained covariance for dependence measurement [C]//Ghahramani Z, Cowell R. 10th International Workshop on Artificial Intelligence and Statistics. USA: Society for Artificial Intelligence and Statistics, 2005: 12-23.

Co-cited references: 43

Citing articles: 4

Secondary citing articles: 10
