Journal Article

Regularized Core Vector Machine with Central Vector-Angular Margin (Cited by 1)
Abstract: To address effective training on large datasets, we propose, building on the maximum vector-angular margin classifier (MAMC), an alternative method of finding the optimal vector d, which yields the central vector-angular margin classifier (CAMC). We then show that the CAMC is equivalent to a corresponding minimum enclosing ball (MEB) problem. However, because the MEB is very sensitive to the selection of the trade-off parameter, we further propose a regularized core vector machine (RCVM); combining the CAMC with the RCVM gives the central vector-angular margin regularized core vector machine (CAMCVM). Experimental results on benchmark UCI datasets show that the CAMC achieves better generalization performance and that the CAMCVM can train effectively and quickly on large-scale datasets.
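Core vector machines scale to large datasets by reducing training to an approximate minimum enclosing ball computation over a small core-set. As a rough illustration of that primitive (not the paper's CAMCVM algorithm itself), the sketch below implements the standard Bădoiu–Clarkson iteration in input space; the function name and the fixed iteration budget are illustrative assumptions:

```python
import numpy as np

def meb_badoiu_clarkson(points, eps=0.05):
    """Approximate the minimum enclosing ball of an (n, d) point array
    via the Badoiu-Clarkson core-set iteration: repeatedly pull the
    center toward the current farthest point.  Returns (center, radius)."""
    pts = np.asarray(points, dtype=float)
    c = pts[0].copy()                       # start from an arbitrary point
    iters = int(np.ceil(1.0 / eps ** 2))    # O(1/eps^2) passes suffice
    for i in range(1, iters + 1):
        dist = np.linalg.norm(pts - c, axis=1)
        far = pts[np.argmax(dist)]          # current farthest point
        c += (far - c) / (i + 1)            # shrinking step toward it
    radius = np.linalg.norm(pts - c, axis=1).max()
    return c, radius
```

In a CVM-style method the same iteration runs in the kernel-induced feature space, so each step touches only the core-set rather than the full dataset, which is what makes training on large-scale data tractable.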
Source: Information and Control (《信息与控制》), CSCD, Peking University Core Journal, 2015, No. 2, pp. 159-164 (6 pages)
Funding: National Natural Science Foundation of China (61170040); Natural Science Foundation of Hebei Province (F2015201185, F2013201220)
Keywords: maximum vector-angular margin classifier; minimum enclosing ball; regularization; core vector machine

References (17)

  • 1 Cortes C, Vapnik V N. Support-vector networks[J]. Machine Learning, 1995, 20(3): 273-297.
  • 2 Chang C C, Lin C J. Training ν-support vector classifiers: Theory and algorithms[J]. Neural Computation, 2002, 13(9): 43-54.
  • 3 Marizio M D, Taylor C C. Kernel density classification and boosting: An L2 analysis[J]. Statistics and Computing, 2005, 15(2): 113-123.
  • 4 陶新民, 曹盼东, 宋少宇, 付丹丹. SVM classification algorithm based on a semi-supervised Gaussian mixture model kernel[J]. Information and Control, 2013, 42(1): 18-26. (Cited by 5)
  • 5 Tax D M J, Duin R P W. Support vector data description[J]. Machine Learning, 2004, 54(1): 45-66.
  • 6 Wu M R, Ye J P. A small sphere and large margin approach for novelty detection using training data with outliers[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2009, 31(11): 2088-2092.
  • 7 Takahashi N, Nishi T. Rigorous proof of termination of SMO algorithm for support vector machines[J]. IEEE Transactions on Neural Networks, 2005, 16(3): 774-776.
  • 8 Smola A, Scholkopf B. Sparse greedy matrix approximation for machine learning[C]//Proceedings of the 17th International Conference on Machine Learning. New York, NY, USA: ACM, 2000: 911-918.
  • 9 Achlioptas D, McSherry F, Scholkopf B. Sampling techniques for kernel methods[C]//Advances in Neural Information Processing Systems. 2002: 335-342.
  • 10 Fine S, Scheinberg K. Efficient SVM training using low-rank kernel representations[J]. Journal of Machine Learning Research, 2001, 2(2): 243-264.

Secondary References (14)

  • 1 全勇, 杨杰. Geodesic distance for support vector machines[J]. Acta Automatica Sinica (《自动化学报》), 2005, 31(2): 202-208. (Cited by 4)
  • 2 Chapelle O, Sindhwani V, Keerthi S S. Optimization techniques for semi-supervised support vector machines[J]. The Journal of Machine Learning Research, 2008, 9(2): 203-233.
  • 3 Sindhwani V, Niyogi P, Belkin M. Beyond the point cloud: From transductive to semi-supervised learning[C]//Proceedings of the 22nd International Conference on Machine Learning. New York, NY, USA: ACM, 2005: 825-832.
  • 4 Belkin M, Niyogi P, Sindhwani V, et al. Manifold regularization: A geometric framework for learning from examples[J]. Journal of Machine Learning Research, 2006, 7(2): 2399-2434.
  • 5 Bai S H, Huang C L, Ma B, et al. Semi-supervised learning of language model using unsupervised topic model[C]//IEEE International Conference on Acoustics, Speech, and Signal Processing. Piscataway, NJ, USA: IEEE, 2010: 5386-5389.
  • 6 Kato T, Kashima H, Sugiyama M. Robust label propagation on multiple networks[J]. IEEE Transactions on Neural Networks, 2009, 20(1): 35-44.
  • 7 Kang F, Jin R, Sukthankar R. Correlated label propagation with application to multi-label learning[C]//Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition. Piscataway, NJ, USA: IEEE, 2006: 1719-1726.
  • 8 Wang F, Zhang C. Label propagation through linear neighborhoods[J]. IEEE Transactions on Knowledge and Data Engineering, 2008, 20(1): 55-67.
  • 9 Wang J, Chang S F, Zhou X, et al. Active microscopic cellular image annotation by superposable graph transduction with imbalanced labels[C]//IEEE Computer Society Conference on Computer Vision and Pattern Recognition. Piscataway, NJ, USA: IEEE, 2008: 8-12.
  • 10 Zhu X J, Ghahramani Z, Lafferty J. Semi-supervised learning using Gaussian fields and harmonic functions[C]//Proceedings of the Twentieth International Conference on Machine Learning. San Francisco, CA, USA: AAAI, 2003: 912-919.

Co-cited Literature (4)

Shared Citing Literature (5)

Citing Articles (1)
