
Kernel optimization approach based on maximum subclass margin criterion
Abstract: After analyzing the limitations of existing kernel optimization methods based on the empirical feature space, a kernel optimization approach based on a maximum subclass margin criterion is proposed. The method first establishes the maximum subclass margin criterion; it then uses the structure of the data in the empirical feature space to derive expressions for the between-subclass and within-subclass scatter matrices of the samples; finally, the optimal kernel parameters are obtained via singular value decomposition. Simulation experiments on UCI (University of California, Irvine) datasets demonstrate the correctness and effectiveness of the proposed method.
Source: Journal of Image and Graphics (《中国图象图形学报》, CSCD, Peking University Core Journal), 2012, No. 12, pp. 1509-1515 (7 pages).
Funding: National Natural Science Foundation of China (61032001, 61102167).
Keywords: kernel function; kernel optimization; maximum subclass margin criterion; target recognition
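
The abstract describes a data-dependent kernel optimization pipeline: map the samples into the empirical feature space through the eigendecomposition of the kernel matrix, form between-subclass and within-subclass scatter matrices there, and choose kernel parameters that maximize the subclass margin criterion. The following Python sketch is a minimal, hypothetical illustration of that pipeline, not the authors' implementation: it assumes an RBF kernel, partitions each class into subclasses with k-means, and replaces the paper's SVD-based parameter solution with a simple grid search over the kernel width; all function and variable names are ours.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import rbf_kernel

def empirical_feature_map(K, tol=1e-10):
    """Map samples into the empirical feature space using the
    eigendecomposition K = P diag(lam) P^T, i.e. Y = P diag(sqrt(lam))."""
    lam, P = np.linalg.eigh(K)
    keep = lam > tol                       # discard numerically zero directions
    return P[:, keep] * np.sqrt(lam[keep])

def subclass_margin_criterion(Y, y, n_subclasses=2):
    """Trace ratio of between-subclass to within-subclass scatter.
    Subclasses are found by k-means inside each class (our assumption)."""
    mean_all = Y.mean(axis=0)
    Sb, Sw = 0.0, 0.0
    for c in np.unique(y):
        Yc = Y[y == c]
        k = min(n_subclasses, len(Yc))
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(Yc)
        for s in range(k):
            Ys = Yc[labels == s]
            mean_s = Ys.mean(axis=0)
            Sb += len(Ys) * np.sum((mean_s - mean_all) ** 2)  # between-subclass scatter
            Sw += np.sum((Ys - mean_s) ** 2)                  # within-subclass scatter
    return Sb / (Sw + 1e-12)

def select_rbf_gamma(X, y, gammas):
    """Pick the RBF width that maximizes the subclass margin criterion
    (grid search stands in for the paper's SVD-based optimization)."""
    scores = [subclass_margin_criterion(empirical_feature_map(rbf_kernel(X, gamma=g)), y)
              for g in gammas]
    return gammas[int(np.argmax(scores))], scores

if __name__ == "__main__":
    # Toy two-class data; a UCI dataset could be substituted here.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (30, 4)), rng.normal(2, 1, (30, 4))])
    y = np.array([0] * 30 + [1] * 30)
    best_gamma, _ = select_rbf_gamma(X, y, gammas=[0.01, 0.1, 1.0, 10.0])
    print("selected gamma:", best_gamma)

The criterion here uses scatter traces rather than the full matrix expressions given in the paper, and the grid search is only a stand-in for the SVD-based parameter selection the abstract refers to; it is meant to convey the shape of the method, not its exact algebra.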
