
Cost-Sensitive Relevance Vector Machine (代价敏感相关向量机)
Abstract: The Relevance Vector Machine (RVM) is a sparse model formulated within the sparse Bayesian framework. Owing to its strong sparsity and generalization ability, it has been widely studied and applied in machine learning in recent years. However, like traditional decision trees, neural networks, and support vector machines, the RVM is not cost-sensitive and cannot be used directly for cost-sensitive learning. To address the costs incurred by misclassification in supervised learning, this paper proposes a Cost-Sensitive Relevance Vector Classification (CS-RVC) algorithm. Building on the RVM, it assigns a different misclassification cost to each class of samples, placing more weight on classification accuracy for the samples whose misclassification cost is high, so that the overall misclassification cost is reduced and cost-sensitive mining is achieved. Experimental results show that the algorithm retains good sparsity and effectively solves cost-sensitive classification problems.
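The paper's CS-RVC implementation is not reproduced here, but the core idea the abstract describes, weighting each class by its misclassification cost so that high-cost errors are avoided, can be illustrated with the standard minimum-expected-cost decision rule applied to a classifier's posterior probabilities. The cost values below are illustrative assumptions, not figures from the paper:

```python
import numpy as np

# Illustrative (assumed) misclassification costs: missing a positive
# is taken to be five times as costly as a false alarm.
C_FN = 5.0  # cost of predicting 0 when the true class is 1
C_FP = 1.0  # cost of predicting 1 when the true class is 0

def cost_sensitive_decision(p_pos):
    """Given posterior probabilities P(y=1|x) (e.g. from an RVM's
    sigmoid link), predict the class with minimum expected cost.

    Expected cost of predicting 0 is p_pos * C_FN; of predicting 1
    it is (1 - p_pos) * C_FP. Predicting 1 wins whenever
    p_pos >= C_FP / (C_FP + C_FN), so the usual 0.5 threshold
    shifts toward the cheap-error class.
    """
    threshold = C_FP / (C_FP + C_FN)
    return (np.asarray(p_pos) >= threshold).astype(int)

probs = np.array([0.10, 0.20, 0.50, 0.90])
print(cost_sensitive_decision(probs))  # threshold = 1/6, so → [0 1 1 1]
```

With C_FN = C_FP the threshold is the ordinary 0.5, recovering the cost-insensitive classifier; raising C_FN lowers the threshold, which is exactly the "pay more attention to the high-cost class" behaviour the abstract attributes to CS-RVC.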
Source: Computer and Modernization (《计算机与现代化》), 2015, No. 2, pp. 19-24.
Funding: National Natural Science Foundation of China, grant 61170152.