
Mutual Information Based Granular Feature Weighted k-Nearest Neighbors Algorithm for Multi-Label Learning
Cited by: 22
Abstract: In traditional kNN-based multi-label learning algorithms, all features contribute equally when computing the distance between instances to find nearest neighbors. Furthermore, most of these algorithms use a decomposition strategy that transforms the multi-label problem into a set of independent single-label binary problems, ignoring the correlations between labels. The performance of a multi-label learning algorithm depends heavily on the input features: different features carry different amounts of knowledge about the label classification, so they should be given different importance. Mutual information is one of the most widely used measures of the dependency between two variables and can effectively evaluate how much label-classification knowledge a feature contains. We therefore propose a granular feature weighted k-nearest neighbors algorithm for multi-label learning based on mutual information (GFWML-kNN), which assigns each feature a weight according to the label-classification knowledge it contains. The algorithm first granulates the label space into several label information granules to avoid the label combination explosion problem, and then computes feature weights for each label granule; because possible label combinations are taken into account, label correlations are merged into the feature weights. Experimental results show that GFWML-kNN achieves better overall performance than several classical multi-label learning algorithms.
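The record includes only the abstract, so the paper's exact weighting formulas and pseudocode are not available here. For background, the mutual information between two discrete variables X and Y is I(X;Y) = Σ_{x,y} p(x,y) log( p(x,y) / (p(x)p(y)) ), which is zero when the variables are independent and grows with their dependency. The sketch below is a hypothetical reading of the recipe the abstract describes, not the authors' GFWML-kNN: it weights each feature by its averaged, normalized mutual information with the labels of one granule, using scikit-learn's mutual_info_classif, and then runs a feature-weighted kNN vote. The function names, the averaging/normalization scheme, and the majority-vote threshold are all assumptions.

```python
# Hypothetical sketch of MI-based feature weighting for multi-label kNN.
# This is NOT the authors' GFWML-kNN; the weighting and voting schemes
# below are illustrative assumptions.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.neighbors import NearestNeighbors

def granule_feature_weights(X, Y_granule):
    """Weight features by their mean mutual information with the binary
    labels of one label granule, normalized to sum to 1 (assumed scheme)."""
    mi_per_label = [
        mutual_info_classif(X, Y_granule[:, j], discrete_features=False)
        for j in range(Y_granule.shape[1])
    ]
    mi = np.mean(mi_per_label, axis=0)  # shape: (n_features,)
    return mi / (mi.sum() + 1e-12)      # guard against all-zero MI

def weighted_knn_predict(X_train, Y_train, X_test, w, k=10):
    """kNN vote on feature-weighted Euclidean distances: scaling each
    feature by sqrt(w) makes plain Euclidean distance equal to the
    w-weighted distance, so an off-the-shelf neighbor search suffices."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_train * np.sqrt(w))
    _, idx = nn.kneighbors(X_test * np.sqrt(w))
    # Assign a label when more than half of the k neighbors carry it
    # (a simple majority vote, assumed for illustration).
    return (Y_train[idx].mean(axis=1) > 0.5).astype(int)
```

In the paper's setting, the label space would first be granulated into several label granules and a separate weight vector computed per granule; for brevity the sketch applies a single weight vector.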
Source: Journal of Computer Research and Development (《计算机研究与发展》), 2017, No. 5, pp. 1024-1035 (12 pages). Indexed in EI, CSCD, and the Peking University Core Journal list.
Funding: National Natural Science Foundation of China (61273304, 61573255); Specialized Research Fund for the Doctoral Program of Higher Education (20130072130004); Natural Science Foundation of Shanghai (14ZR1442600)
Keywords: mutual information; feature weight; granulation; multi-label learning; k-nearest neighbors

References: 2

Secondary references: 66


Co-citing literature: 43
Co-cited literature: 164
Citing literature: 22
Secondary citing literature: 84
