
Generalized Uniform Chebyshev Polynomial Kernel
Cited by: 1
Abstract: To address redundant attributes in regression on small samples that are sparsely distributed and have weak features, a variable orthogonal polynomial kernel for vector inputs, the generalized uniform Chebyshev polynomial kernel (GUCK), is proposed on the basis of unified Chebyshev polynomials (UCP). The new kernel enlarges the search space of candidate kernel functions through the orthogonality and adaptability of UCP, and controls the dimension of the feature space by adjusting the polynomial order, thereby resolving the redundant-attribute problem in sparse-data regression. Mercer's theorem is used to prove that GUCK is a valid support vector machine (SVM) kernel. Comparative experiments on several benchmark data sets and a practical engineering data set show that GUCK achieves high prediction accuracy and good generalization, and outperforms other Chebyshev polynomial kernels on the majority of the benchmark data sets.
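The closed form of GUCK is not given in this abstract, so the sketch below only illustrates the general idea with a conventional Chebyshev-style polynomial kernel for vector inputs: first-kind Chebyshev polynomials T_j are evaluated elementwise up to a chosen order, their inner products are summed, and a damping factor keeps the kernel bounded. The function name chebyshev_kernel, the damping term, and the parameters order and gamma are illustrative assumptions, not the published UCP/GUCK formula; order stands in for the polynomial order that controls the feature-space dimension, and the eigenvalue check is an empirical stand-in for the Mercer-validity argument.

```python
import numpy as np
from numpy.polynomial import chebyshev as C
from sklearn.svm import SVR

def chebyshev_kernel(X, Z, order=4, gamma=1.0):
    """Illustrative Chebyshev-style polynomial kernel for vector inputs.

    Assumes features are scaled to [-1, 1]. For each degree j <= order,
    T_j is applied elementwise and the resulting feature maps are combined
    by an inner product; a Gaussian-type damping term keeps the kernel
    bounded. This is NOT the published GUCK formula, only a sketch.
    """
    X, Z = np.asarray(X, float), np.asarray(Z, float)
    K = np.zeros((X.shape[0], Z.shape[0]))
    for j in range(order + 1):
        coef = np.zeros(j + 1)
        coef[j] = 1.0                      # coefficient vector selecting T_j
        Tx = C.chebval(X, coef)            # elementwise T_j(x), same shape as X
        Tz = C.chebval(Z, coef)
        K += Tx @ Tz.T                     # sum_j <T_j(x), T_j(z)>
    # damping factor, a common choice in Chebyshev-kernel constructions;
    # the exact weighting used by GUCK may differ
    sq = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2 * X @ Z.T
    return K * np.exp(-gamma * sq)

# toy usage: the polynomial order controls the effective feature-space size
rng = np.random.default_rng(0)
Xtr = rng.uniform(-1, 1, (60, 3))
ytr = np.sin(Xtr.sum(axis=1)) + 0.05 * rng.standard_normal(60)

# empirical Mercer check: the Gram matrix on the sample should be PSD
G = chebyshev_kernel(Xtr, Xtr, order=4)
print("min eigenvalue:", np.linalg.eigvalsh(G).min())

svr = SVR(kernel=lambda A, B: chebyshev_kernel(A, B, order=4))
svr.fit(Xtr, ytr)
print("train R^2:", svr.score(Xtr, ytr))
```

Because the sketched kernel is a sum of inner products of polynomial feature maps multiplied by a Gaussian kernel, it is positive semidefinite by construction, which is why the minimum eigenvalue of the Gram matrix stays non-negative up to floating-point error.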
Source: Journal of Xi'an Jiaotong University (《西安交通大学学报》), indexed in EI, CAS, CSCD, Peking University Core, 2012, Issue 8, pp. 43-48 (6 pages)
Funding: National Natural Science Foundation of China (Grant No. 10776026)
Keywords: uniform Chebyshev polynomials; uniform Chebyshev kernel; support vector machine; regression problem


