
Acceleration of SVM for Multi-class Classification (cited by: 5)
Abstract: With their excellent classification performance and solid theoretical foundation, support vector machines (SVMs) have become one of the most important classification methods in pattern recognition, machine learning, and data mining in recent years. However, their training time grows markedly as the number of training instances increases, and training becomes even more complex for multi-class problems. To address these problems, a fast training-data reduction method for multi-class classification, named MOIS, is proposed. Using cluster centers as reference points, the method deletes redundant training instances while selecting the boundary instances that play a decisive role in training, thereby greatly reducing the training data, and it also alleviates the distribution imbalance between classes. Experiments show that MOIS can dramatically improve training efficiency while maintaining or even improving classification accuracy. For example, on the Optdigit dataset the proposed method raises classification accuracy from 98.94% to 99.05% while reducing training time to 15% of the original; on the dataset formed by the first 100 classes of HCL2000, training time is cut to less than 6% of the original while accuracy improves slightly (from 99.29% to 99.30%). Moreover, MOIS itself runs with high efficiency.
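The abstract does not spell out the MOIS algorithm itself, but its core idea (using per-class cluster centers as reference points, discarding redundant interior instances, and keeping boundary instances while capping each class's share to ease imbalance) can be sketched as follows. This is a minimal illustration under assumed simplifications, not the authors' method: it uses a single class mean as the "cluster center" per class and treats the instances farthest from it as boundary instances.

```python
import numpy as np

def boundary_reduce(X, y, keep_ratio=0.3, max_per_class=None):
    """Sketch of a MOIS-like reduction (hypothetical, not the paper's exact
    algorithm): for each class, use the class mean as a reference point and
    keep only the instances farthest from it, assuming those lie near the
    class boundary and matter most for SVM training. `max_per_class` caps
    each class's contribution, mimicking the imbalance relief MOIS claims."""
    keep_idx = []
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        center = X[idx].mean(axis=0)                      # reference point
        dist = np.linalg.norm(X[idx] - center, axis=1)    # distance to center
        k = max(1, int(len(idx) * keep_ratio))
        if max_per_class is not None:
            k = min(k, max_per_class)                     # cap class size
        # farthest-from-center instances approximate the class boundary;
        # the near-center (redundant) instances are dropped
        keep_idx.extend(idx[np.argsort(dist)[-k:]])
    return np.sort(np.array(keep_idx))

# Usage on synthetic 2-class data: 100 instances shrink to 20.
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(50, 2), rng.randn(50, 2) + 5.0])
y = np.array([0] * 50 + [1] * 50)
sel = boundary_reduce(X, y, keep_ratio=0.2)
```

The reduced index set `sel` would then be passed to an ordinary SVM trainer; the speedup in the paper comes from this shrinkage, since SVM training cost grows super-linearly with the number of instances.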
Author: CHEN Jing-nian (Department of Information and Computing Science, Shandong University of Finance and Economics, Jinan 250014, China)
Source: Computer Science (《计算机科学》), CSCD, Peking University core journal, 2022, Issue S01, pp. 297-300 (4 pages)
Fund: National Natural Science Foundation of China (61773325)
Keywords: Support vector machines; Multi-class classification; Data reduction; Clustering; Instance selection
