A Weighted Linear Loss Support Vector Machine
Abstract: Traditional support vector machines (SVMs) do not scale well to large problems. By introducing a weighted linear loss function in place of the hinge loss of the standard SVM, this paper proposes a weighted linear loss support vector machine (WLSVM). Its main features are: (1) by weighting the linear loss, training points at different positions receive different penalties, which avoids overfitting to a certain extent and enhances generalization ability; (2) the separating hyperplane is obtained by evaluating very simple mathematical expressions, which makes the method convenient for large-scale problems. Experiments on both synthetic and real datasets show that the classification accuracy of WLSVM is higher than that of SVM and LS-SVM, while the computation time is reduced, especially on large-scale problems.
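The abstract's central point is that, unlike the hinge loss, a weighted linear loss lets the hyperplane be obtained from very simple expressions rather than a quadratic program. The following Python sketch illustrates that general idea only; it assumes a regularized objective with a closed-form minimizer and an illustrative position-based re-weighting rule, and the paper's exact formulation, weighting scheme, and kernel treatment may differ.

```python
# A minimal, illustrative sketch of the weighted linear loss idea (WLSVM-style).
# Assumptions not taken from the paper: the objective is
#     min_{w,b}  1/2 (||w||^2 + b^2) + C * sum_i v_i * (1 - y_i * (w @ x_i + b)),
# whose minimizer has the closed form  w = C * sum_i v_i y_i x_i,  b = C * sum_i v_i y_i,
# and the weights v_i follow a simple position-based heuristic.  The paper's exact
# formulation, weighting rule, and kernel handling may differ.
import numpy as np


def fit_wlsvm(X, y, C=1.0, n_iter=5):
    """Fit a linear classifier with a weighted linear loss.

    X: (n_samples, n_features) array; y: labels in {-1, +1}.
    Returns (w, b) for the decision function f(x) = w @ x + b.
    """
    n = X.shape[0]
    v = np.ones(n)                       # start from uniform weights
    for _ in range(n_iter):
        # Closed-form minimizer of the assumed regularized linear-loss objective:
        # only sums and products, no quadratic program.
        w = C * (v * y) @ X              # sum_i v_i * y_i * x_i
        b = C * np.sum(v * y)
        # Heuristic re-weighting by position: points already far on the correct
        # side of the margin get a smaller weight, which imitates the hinge loss
        # and limits overfitting (an illustrative choice, not the paper's rule).
        margins = y * (X @ w + b)
        v = 1.0 / (1.0 + np.maximum(margins - 1.0, 0.0))
    return w, b


def predict_wlsvm(X, w, b):
    return np.sign(X @ w + b)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(+1.0, 1.0, (200, 2)),
                   rng.normal(-1.0, 1.0, (200, 2))])
    y = np.hstack([np.ones(200), -np.ones(200)])
    w, b = fit_wlsvm(X, y, C=0.01)
    print("training accuracy:", np.mean(predict_wlsvm(X, w, b) == y))
```

In this sketch the weights shrink the influence of points far on the correct side of the margin, which is one plausible way to realize the "different penalties at different positions" described in the abstract.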
Source: Computer Applications and Software (《计算机应用与软件》, CSCD), 2015, No. 12, pp. 114-117 (4 pages)
Funding: National Natural Science Foundation of China (61272500)
Keywords: pattern recognition, support vector machine, linear loss, weighting factor, large-scale problems