
Structural Risk Minimization Nearest Neighbor Analysis for SVM Learning on Large-Scale Training Sets (Cited by 3)

A Learning and Classification Scheme of Large Training Set SVM Based on NN-SRM Analysis
Abstract: An SVM constructs the maximum-margin separating hyperplane from the small number of vectors that lie near the class boundary. When massive training samples from different classes overlap one another, the number of support vectors grows sharply and training becomes far more difficult. To address this problem, this paper combines structural risk minimization (SRM) nearest neighbor analysis with the support vector machine to form a new SVM learning method. First, a training subset is selected by SRM nearest neighbor analysis according to each training sample's nearest-neighbor distance to the opposite class. Within the selected sample subspace, the Lagrange multipliers are then obtained directly by a multiplicative update rule rather than by traditional quadratic programming. Finally, the remaining samples are added for cross-validation until the algorithm satisfies its convergence criterion. Classification experiments show that the proposed algorithm performs well; in particular, when the training set is huge and the number of support vectors is large, it reduces the computational complexity considerably and speeds up classification.
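The abstract names two computational components: pruning the training set by inter-class nearest-neighbor distance, and solving the reduced dual with a multiplicative rule instead of quadratic programming. The sketch below is a minimal illustration of that pipeline, not the authors' code: the function names (nn_srm_select, multiplicative_svm), the Euclidean distance, the keep fraction, and the bias-free hard-margin form of the dual are all assumptions, and the multiplicative rule shown is the standard nonnegative-QP update of Sha, Saul and Lee, which the paper's exact rule may or may not match.

import numpy as np

def nn_srm_select(X, y, keep_frac=0.3):
    # Keep the keep_frac of samples whose nearest neighbor in the opposite
    # class is closest: points near the class boundary are the likely
    # support vectors, so far-away points are dropped before training.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    opposite = y[:, None] != y[None, :]                   # True for inter-class pairs
    d_inter = np.where(opposite, d2, np.inf).min(axis=1)  # inter-class NN distance
    keep = np.argsort(d_inter)[: max(2, int(keep_frac * len(X)))]
    return np.sort(keep)

def multiplicative_svm(K, y, iters=2000, tol=1e-8):
    # Multiplicative update (Sha, Saul & Lee) for the bias-free
    # hard-margin SVM dual:
    #   min_a  1/2 a^T Q a - 1^T a,  a >= 0,  Q_ij = y_i y_j K_ij,
    # with Q split into nonnegative parts Q = Qp - Qm. The rescaling
    # factor is > 1 where the dual gradient is negative and < 1 where
    # it is positive, so the iterates stay nonnegative throughout.
    Q = np.outer(y, y) * K
    Qp, Qm = np.maximum(Q, 0.0), np.maximum(-Q, 0.0)
    a = np.full(len(y), 0.1)
    for _ in range(iters):
        p = Qp @ a + 1e-12                                # avoid division by zero
        m = Qm @ a
        a_next = a * (1.0 + np.sqrt(1.0 + 4.0 * p * m)) / (2.0 * p)
        if np.abs(a_next - a).max() < tol:
            return a_next
        a = a_next
    return a

# Illustrative use on synthetic data: prune, then train on the subset only.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (200, 2)), rng.normal(2.5, 1.0, (200, 2))])
y = np.hstack([np.ones(200), -np.ones(200)])
idx = nn_srm_select(X, y)
Xs, ys = X[idx], y[idx]
K = np.exp(-0.5 * ((Xs[:, None, :] - Xs[None, :, :]) ** 2).sum(-1))  # RBF kernel
alpha = multiplicative_svm(K, ys)  # large alpha_i mark the support vectors

Per the abstract, the samples dropped by the selection step would then be fed back as a cross-validation set, repeating until the convergence criterion is met; that outer loop is omitted here.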
Authors: Hu Zhengping (胡正平), Zhang Ye (张晔)
Source: Journal of Signal Processing (《信号处理》; CSCD; Peking University Core Journal), 2007, No. 2: 161-164 (4 pages)
Funding: National Natural Science Foundation of China (60272073)
Keywords: structural risk minimization; support vector machines; kernel function; multiplicative update; nearest neighbor classifier
Related Literature

References (1)

Secondary references (9)

  • 1. Hearst M. A., Dumais S. T., Osman E., Platt J., Schölkopf B. Support vector machines. IEEE Intelligent Systems, 1998, 13(4): 18-28.
  • 2. Vapnik V. N. An overview of statistical learning theory. IEEE Transactions on Neural Networks, 1999, 10(5): 988-999.
  • 3. Vapnik V. N. Statistical Learning Theory. 2nd ed. New York: Springer-Verlag, 1999.
  • 4. Müller Klaus-Robert, Mika Sebastian, Rätsch Gunnar, Tsuda Koji, Schölkopf Bernhard. An introduction to kernel-based learning algorithms. IEEE Transactions on Neural Networks, 2001, 12(2): 181-201.
  • 5. Burges C. J. C. A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery, 1998, 2(2): 121-167.
  • 6. Ke Hai-Xin, Zhang Xue-Gong. Editing support vector machines. In: Proceedings of the International Joint Conference on Neural Networks, Washington, DC, 2001, 2: 1464-1467.
  • 7. Zhang Xue-Gong. On statistical learning theory and support vector machines [J]. Acta Automatica Sinica, 2000, 26(1): 32-42. (Cited by 2256)
  • 8. Zhang Hongbin, Sun Guangyu. Optimal selection of the reference sample set for the nearest neighbor method [J]. Acta Electronica Sinica, 2000, 28(11): 16-21. (Cited by 8)
  • 9. Li Honglian, Wang Chunhua, Yuan Baozong. NN-SVM: an improved support vector machine [J]. Chinese Journal of Computers, 2003, 26(8): 1015-1020. (Cited by 71)

Co-citing literature (52)

Co-cited literature (26)

  • 1. Gao Quanxue, Pan Quan, Liang Yan, Zhang Hongcai, Cheng Yongmei. Face recognition based on descriptive features [J]. Acta Automatica Sinica, 2006, 32(3): 386-392. (Cited by 13)
  • 2. Su Hongtao. Research on face recognition technology based on statistical features [D]. Xi'an: Northwestern Polytechnical University, 2005.
  • 3. Vapnik V. N. The Nature of Statistical Learning Theory (Chinese edition) [M]. Beijing: Tsinghua University Press, 2000.
  • 4. Dean J, Ghemawat S. MapReduce: Simplified data processing on large clusters [C]// Proceedings of the 6th USENIX Symposium on Operating System Design and Implementation (OSDI). New York: ACM Press, 2004: 137-150.
  • 5. Cortes C, Vapnik V. Support-vector networks [J]. Machine Learning, 1995, 20: 273-297.
  • 6. Vapnik V. N. Statistical Learning Theory [M]. New York: Wiley, 1998: 493-520.
  • 7. Kreßel U. Pairwise classification and support vector machines [M]. Cambridge, USA: The MIT Press, 1999: 255-268.
  • 8. Yang Junyan, Zhang Youyun, Zhu Yongsheng. Classification performance of support vector machines with the ε-insensitive loss function [J]. Journal of Xi'an Jiaotong University, 2007, 41(11): 1315-1320. (Cited by 17)
  • 9. Vapnik V. Statistical Learning Theory [M]. 2nd ed. New York: Springer, 2001.
  • 10. Cherkassky V, Ma Y. Practical selection of SVM parameters and noise estimation for SVM regression [J]. Neural Networks, 2004, 17(1): 113-126.

Citing literature (3)

Secondary citing literature (5)
