
Linear kernel support vector machine based on dual random projection

Cited by: 1
Abstract: To address the loss of classification accuracy when a large-scale Support Vector Machine (SVM) is trained after random-projection-based feature dimensionality reduction, a Linear kernel SVM based on dual random projection (drp-LSVM) for large-scale classification problems was proposed by introducing dual recovery theory. Firstly, the relevant geometric properties of drp-LSVM were analyzed and proved, showing that drp-LSVM retains geometric advantages similar to those of the Linear kernel SVM based on random projection (rp-LSVM), while its separating hyperplane is closer to the original classifier trained on the complete data. Then, to solve drp-LSVM quickly, the traditional Sequential Minimal Optimization (SMO) algorithm was improved, and a drp-LSVM classifier based on the improved SMO algorithm was designed. Finally, the experimental results show that drp-LSVM inherits the advantages of rp-LSVM, reduces the classification error, improves the training accuracy, and all of its performance indexes are closer to those of the classifier trained on the original data; the classifier based on the improved SMO algorithm reduces memory consumption while achieving higher training accuracy.
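The abstract combines two ideas: train an SVM in a randomly projected low-dimensional space, then use the dual variables obtained there to recover a separating hyperplane in the original feature space. Below is a minimal sketch of that dual-recovery idea, not the authors' implementation: the reduced dimension k, the use of scikit-learn's SVC as the low-dimensional solver (instead of the paper's improved SMO), and taking the bias from the projected model are all illustrative assumptions.

```python
# Sketch of random projection + dual recovery for a linear SVM (assumptions noted inline).
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=200, random_state=0)
y_signed = np.where(y == 1, 1.0, -1.0)

# 1. Random projection: R has i.i.d. Gaussian entries scaled by 1/sqrt(k).
k = 20                                                  # reduced dimension (assumed)
R = rng.standard_normal((X.shape[1], k)) / np.sqrt(k)
X_low = X @ R                                           # projected training data

# 2. Solve the SVM dual in the low-dimensional space (the plain rp-LSVM step).
clf = SVC(kernel="linear", C=1.0)
clf.fit(X_low, y_signed)
alpha = np.zeros(X.shape[0])
alpha[clf.support_] = np.abs(clf.dual_coef_.ravel())    # dual variables alpha_i >= 0

# 3. Dual recovery: reuse the dual variables with the ORIGINAL features to build a
#    hyperplane closer to the one trained on the full-dimensional data.
w_recovered = (alpha * y_signed) @ X                    # w = sum_i alpha_i * y_i * x_i
b_recovered = clf.intercept_[0]                         # bias reused from the projected model (assumption)

pred = np.sign(X @ w_recovered + b_recovered)
print("training accuracy of recovered classifier:", (pred == y_signed).mean())
```

In this sketch only the low-dimensional problem is ever solved, so the cost of the quadratic program depends on k rather than the original feature count, while the recovered weight vector lives in the original space.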
Source: 《计算机应用》 (Journal of Computer Applications), CSCD, Peking University Core Journal, 2017, Issue 6: 1680-1685 (6 pages)
Funding: National Natural Science Foundation of China (71503260); Natural Science Foundation of Shaanxi Province (2014JM8345)
Keywords: machine learning; Support Vector Machine (SVM); random projection; Sequential Minimal Optimization (SMO) algorithm; dimensionality reduction


