
Instance Reduction Support Vector Machine (cited by: 2)
Abstract: In a support vector machine (SVM), the optimal classification hyperplane is constructed only from the subset of training samples (the support vectors) that lie near the decision boundary. Solving the SVM, however, requires the whole training set; when the training set is large, finding the optimal solution takes a long time and a great amount of memory. To address this problem, this paper presents a method, called instance reduction, for selecting candidate support vectors. Since almost all support vectors lie near the classification boundary, the instances in the boundary region can be selected as candidate support vectors using the tolerance rough set technique; the SVM is then trained only on the selected instances. Experimental results show that the proposed method is effective and can efficiently reduce both time and space complexity, especially on large databases.
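The selection step described above can be sketched in code. This is a minimal illustration, not the authors' implementation: here an instance is treated as belonging to the boundary region (in the tolerance-rough-set sense) when its tolerance class, taken as all training points within a distance threshold `eps`, contains more than one class label; the SVM is then trained only on those candidates. The threshold, the synthetic data, and the `boundary_region` helper are all illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two overlapping Gaussian classes; points deep inside either class
# have single-label neighborhoods and should be filtered out.
X = np.vstack([rng.normal(-1.5, 1.0, (200, 2)),
               rng.normal(+1.5, 1.0, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

def boundary_region(X, y, eps=1.0):
    """Indices of instances whose eps-tolerance class has mixed labels."""
    idx = []
    for i in range(len(X)):
        neigh = np.linalg.norm(X - X[i], axis=1) <= eps
        if len(np.unique(y[neigh])) > 1:  # tolerance class is inconsistent
            idx.append(i)
    return np.array(idx)

cand = boundary_region(X, y, eps=1.0)          # candidate support vectors
reduced = SVC(kernel="linear").fit(X[cand], y[cand])
full = SVC(kernel="linear").fit(X, y)

print(f"kept {len(cand)}/{len(X)} instances")
print(f"full acc {full.score(X, y):.3f}, reduced acc {reduced.score(X, y):.3f}")
```

Because true support vectors lie near the boundary, the reduced training set typically preserves accuracy while shrinking the quadratic-programming problem the SVM solver must handle.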
Source: Journal of Frontiers of Computer Science and Technology (《计算机科学与探索》, CSCD), 2011, No. 12, pp. 1131-1138 (8 pages).
Funding: National Natural Science Foundation of China (No. 60903088); Natural Science Foundation of Hebei Province (No. F2010000323); Key Science and Technology Foundation of Higher Education Institutions of Hebei Province (No. ZD2010139).
Keywords: tolerance rough sets; instance selection; support vector machine (SVM); optimal classification hyperplane; statistical learning theory

