About Learning Space and Non-Monotonicity in Assessment of Classification Risk
Abstract: Allowing the empirical risk to be non-zero is the hallmark that distinguishes modern methods of constructing pattern classifiers from traditional ones. To investigate more deeply how this change of viewpoint affects pattern classification systems, and to extend their learning space, the author discusses the limits on classification performance imposed on traditional systems that require zero empirical risk, analyzes the key factors that determine the performance of a pattern classification system, gives a necessary condition for the learning space to be extensible, and constructs an opportunistic learning method with which a sufficient condition for extensibility is proved. Experiments further show that classifier assessment and the classification risk on the test set are not monotonically consistent, a conclusion with serious implications for pattern recognition and its applications.

The characteristic of modern pattern classification methods is to admit a non-zero empirical risk, whereas an inseparable feature set never gives a learning algorithm the chance to produce a classifier with zero empirical risk. To investigate the potential connection between inseparable feature sets, which intuition usually dismisses as untrustworthy, and the modern view of the learning problem, this paper argues the necessary condition for the availability of an inseparable feature set, and elaborates an opportunistic learning method to validate the sufficient condition experimentally. Experimental evidence shows that an inseparable feature subset can contribute substantially to improving the performance of a pattern classifier. Furthermore, the relation between the assessment of a classifier and its predictive performance on the test set is shown in experiments to be non-monotone. Both the analytical results and the experimental studies indicate that this conclusion may pose a challenge to pattern classification and its applications.
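The non-monotone relation described in the abstract, namely that a lower empirical risk does not imply a lower risk on the test set, can be illustrated with a toy sketch. This is not the paper's opportunistic learning method: the synthetic data, the 20% label noise, and both classifiers below are invented purely for illustration.

```python
import random

random.seed(0)

def make_data(n, noise=0.2):
    """Synthetic sample: label is x > 0, with a `noise` fraction of labels flipped."""
    xs = [random.uniform(-1.0, 1.0) for _ in range(n)]
    ys = []
    for x in xs:
        y = int(x > 0)
        if random.random() < noise:   # inject label noise
            y = 1 - y
        ys.append(y)
    return xs, ys

x_tr, y_tr = make_data(200)    # training sample
x_te, y_te = make_data(2000)   # test sample

def risk(pred, xs, ys):
    """0-1 risk of classifier `pred` on the sample (xs, ys)."""
    return sum(pred(x) != y for x, y in zip(xs, ys)) / len(xs)

def memoriser(x):
    """1-nearest-neighbour rule: memorises the training set, so its
    empirical risk on (x_tr, y_tr) is exactly zero."""
    j = min(range(len(x_tr)), key=lambda i: abs(x - x_tr[i]))
    return y_tr[j]

def threshold(x):
    """Simple rule x > 0: non-zero empirical risk, since it cannot fit the noise."""
    return int(x > 0)

emp_a, test_a = risk(memoriser, x_tr, y_tr), risk(memoriser, x_te, y_te)
emp_b, test_b = risk(threshold, x_tr, y_tr), risk(threshold, x_te, y_te)

print(f"memoriser: empirical risk = {emp_a:.3f}, test risk = {test_a:.3f}")
print(f"threshold: empirical risk = {emp_b:.3f}, test risk = {test_b:.3f}")
```

With these settings the memoriser attains the lower (zero) empirical risk yet the higher test risk, so ranking the two classifiers by empirical risk and by test risk gives opposite orders: the two assessments are not monotonically consistent.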
Author: He Jinsong (何劲松)
Source: Chinese Journal of Computers (《计算机学报》), 2007, No. 2, pp. 168-175 (8 pages). Indexed in EI and CSCD; Peking University core journal.
Funding: Supported by the Research Fund of the MOE-Microsoft Key Laboratory of Multimedia Computing and Communication (05071808) and the National Natural Science Foundation of China (60573170).
Keywords: pattern classification; machine learning; feature selection; hybrid learning; empirical risk; practical risk