
A Kernel Function Method Based on Orthogonal Polynomials  Cited by: 2

Review of Chebyshev Kernel Functions
Abstract: In practical engineering, small-sample data sets are typically sparse and their distribution characteristics are not obvious. This paper analyzes several existing methods for this setting, points out their shortcomings, and then focuses on a class of recently proposed kernel methods based on Chebyshev polynomials. Owing to the orthogonality of Chebyshev polynomials, these kernel functions can find a better separating hyperplane in the high-dimensional feature space. Experiments on benchmark data sets evaluate the generalization performance and learning efficiency of these kernels against conventional SVM kernels. The results show that the Chebyshev kernels require fewer support vectors than the other kernels while achieving excellent generalization performance and prediction accuracy. Finally, the paper discusses the remaining problems of this class of kernel methods and points out that Chebyshev polynomial kernels have great potential for small-sample regression problems and deserve further study.
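The abstract does not reproduce the kernel expression itself. As a rough illustration only, the sketch below implements one generalized Chebyshev kernel form discussed in the related literature, K(x, z) = ∏_j [ Σ_{i=0}^{n} T_i(x_j) T_i(z_j) ] / √(γ − x_j z_j), and plugs it into scikit-learn's SVR as a callable kernel. The kernel expression, the γ offset in the denominator, the polynomial degree, and all other parameter values are assumptions made for this sketch, not the formulation used in the paper.

```python
import numpy as np
from numpy.polynomial import chebyshev as C
from sklearn.svm import SVR


def chebyshev_kernel(X, Z, degree=4, gamma=1.5):
    """Generalized Chebyshev kernel (assumed form, see lead-in).

    Per feature j: k_j(x, z) = sum_{i=0..degree} T_i(x_j) * T_i(z_j)
                               / sqrt(gamma - x_j * z_j)
    The full kernel is the product of k_j over all features.
    Inputs are assumed to be scaled into [-1, 1], the natural
    domain of the Chebyshev polynomials T_i.
    """
    X = np.atleast_2d(np.asarray(X, dtype=float))
    Z = np.atleast_2d(np.asarray(Z, dtype=float))
    n, d = X.shape
    m = Z.shape[0]
    K = np.ones((n, m))
    for j in range(d):
        # T_0..T_degree evaluated at every sample's j-th feature
        Tx = np.stack([C.chebval(X[:, j], [0] * i + [1])
                       for i in range(degree + 1)])   # (degree+1, n)
        Tz = np.stack([C.chebval(Z[:, j], [0] * i + [1])
                       for i in range(degree + 1)])   # (degree+1, m)
        num = Tx.T @ Tz                               # sum_i T_i(x_j) T_i(z_j)
        den = np.sqrt(gamma - X[:, j][:, None] * Z[:, j][None, :])
        K *= num / den                                # gamma > 1 keeps den real on [-1, 1]
    return K


if __name__ == "__main__":
    # Tiny small-sample regression demo: noisy sine on 40 points
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(40, 2))
    y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.normal(size=40)

    svr = SVR(kernel=lambda A, B: chebyshev_kernel(A, B, degree=4, gamma=1.5))
    svr.fit(X, y)
    print("support vectors used:", svr.support_.size, "out of", len(X))
```

Counting svr.support_ after fitting gives the kind of comparison the abstract describes: the same data can be refit with an RBF or polynomial kernel and compared on support-vector count and prediction error.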
Source: Computer Technology and Development (《计算机技术与发展》), 2012, No. 5, pp. 177-179, 184 (4 pages)
Funding: National Natural Science Foundation of China (61173040)
Keywords: Chebyshev polynomials; regression problem; small-sample data set; generalization performance
