
An Effective Pruning Algorithm for Least Squares Support Vector Machine Classifier (cited by: 4)
Abstract: A well-known drawback of the least squares support vector machine (LS-SVM) is that sparseness is lost. This study develops an effective pruning algorithm to address that problem. To avoid solving the initial full set of linear equations, a bottom-up strategy is adopted: during training, chunking incremental and decremental learning procedures are applied alternately according to specific pruning conditions, so that a small support vector set, covering most of the information in the training set, forms adaptively. The final classifier is constructed from this support vector set. To test the validity of the proposed algorithm, it was applied to five benchmark UCI datasets, and different chunking sizes were tried to show the relationships among the chunking size, the number of support vectors, the training time, and the testing accuracy. The experimental results show that, with a chunking size of 2, the proposed algorithm adaptively obtains sparse solutions with almost no loss of generalization performance, and that its training speed is much faster than that of the sequential minimal optimization (SMO) algorithm. The proposed algorithm applies not only to the LS-SVM classifier but also extends to the least squares support vector regression machine.
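The abstract assumes familiarity with the standard LS-SVM classifier, whose training reduces to a single linear system — the very system whose full solution the proposed algorithm avoids. The sketch below is illustrative only: the RBF kernel, the values gamma=10 and sigma=1, the toy data, and the one-shot keep-largest-|alpha| pruning step are assumptions for demonstration, not the paper's chunking incremental/decremental procedure or its pruning conditions.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM KKT linear system
        [ 0    y^T             ] [  b  ]   [ 0 ]
        [ y    Omega + I/gamma ] [alpha] = [ 1 ]
    with Omega_ij = y_i * y_j * K(x_i, x_j)."""
    n = y.size
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], np.ones(n))))
    return sol[0], sol[1:]  # bias b, Lagrange multipliers alpha

def lssvm_predict(X_sv, y_sv, alpha, b, X_new, sigma=1.0):
    # Decision function: sign(sum_i alpha_i * y_i * K(x, x_i) + b)
    return np.sign(rbf_kernel(X_new, X_sv, sigma) @ (alpha * y_sv) + b)

# Toy two-blob data (illustrative only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.5, (20, 2)), rng.normal(1.0, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])

b, alpha = lssvm_train(X, y)
acc_full = (lssvm_predict(X, y, alpha, b, X) == y).mean()

# One-shot |alpha|-based pruning (a simple stand-in, NOT the paper's
# chunking incremental/decremental procedure): keep the half of the
# training set with the largest |alpha|, then retrain on it.
keep = np.argsort(np.abs(alpha))[20:]
b_p, alpha_p = lssvm_train(X[keep], y[keep])
acc_pruned = (lssvm_predict(X[keep], y[keep], alpha_p, b_p, X) == y).mean()
```

The motivation for pruning by |alpha| is that in LS-SVM each multiplier satisfies alpha_i = gamma * e_i, i.e. it is proportional to the residual of point i, so points with small |alpha| carry little information about the decision boundary and can be dropped with little loss of accuracy — the sparseness-recovery idea that the paper's algorithm achieves incrementally, without ever forming the full n-by-n system.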
Source: Journal of Computer Research and Development (《计算机研究与发展》; indexed in EI, CSCD, Peking University Core), 2007, Issue 7, pp. 1128-1136 (9 pages).
Funding: Australia Research Council (ARC) project (DP0559213); Natural Science Foundation of Guangdong Province projects (031360, 04020079); Open Project of the Key Laboratory of Symbolic Computation and Knowledge Engineering of the Ministry of Education, Jilin University (93K-17-2006-03).
Keywords: least squares support vector machine; pruning; chunking incremental learning; decremental learning; adaptive
