
Classification Algorithm of l_2-norm LS-SVM via Coordinate Descent
Abstract  The coordinate descent implementation of the l2-norm regularized least squares support vector machine (LS-SVM) is studied. In image processing, human genome analysis, information retrieval, data management and data mining, the data that a machine learning objective function must process often cannot fit in memory. Recent work shows that coordinate descent methods achieve good classification performance for large-scale linear SVMs. Building on that work, the coordinate descent method is extended to the least squares support vector machine, and a coordinate descent l2-norm LS-SVM classification algorithm is proposed. The algorithm reduces the optimization of the model vector in the LS-SVM objective function to a sequence of single-variable optimizations over individual feature components. Experiments on high-dimensional small-sample datasets, medium-scale datasets and large-scale datasets demonstrate the effectiveness of the algorithm. Compared with standard LS-SVM classifiers, the proposed method is a good candidate when the data cannot fit in memory.
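The per-coordinate update described in the abstract (one exact minimization per feature component of the model vector) can be illustrated with a short sketch. This is a minimal illustration, not the authors' implementation: the function name cd_ls_svm, the assumed primal objective 0.5*||w||^2 + 0.5*C*sum_i (w^T x_i - y_i)^2 with labels in {-1, +1} and no bias term, and the parameters C, max_epochs and tol are all assumptions introduced here.

```python
import numpy as np

def cd_ls_svm(X, y, C=1.0, max_epochs=50, tol=1e-6):
    # Coordinate descent for an assumed l2-regularized least-squares SVM primal:
    #     min_w  0.5*||w||^2 + 0.5*C*sum_i (w^T x_i - y_i)^2,   y_i in {-1, +1}.
    # Each inner step minimizes the objective exactly over a single component w_j,
    # since the objective is quadratic in w_j.
    n, d = X.shape
    w = np.zeros(d)
    r = X @ w - y                                # residuals w^T x_i - y_i
    col_sq = C * np.einsum('ij,ij->j', X, X)     # C * sum_i x_ij^2, fixed per column

    for _ in range(max_epochs):
        max_step = 0.0
        for j in range(d):
            g = w[j] + C * (X[:, j] @ r)         # partial derivative w.r.t. w_j
            h = 1.0 + col_sq[j]                  # second derivative w.r.t. w_j
            delta = -g / h                       # exact single-variable minimizer
            w[j] += delta
            r += delta * X[:, j]                 # keep residuals consistent in O(n)
            max_step = max(max_step, abs(delta))
        if max_step < tol:                       # stop when no coordinate moves much
            break
    return w

# hypothetical usage: classify by the sign of the decision value
# y_pred = np.sign(X_test @ w)
```

Maintaining the residual vector makes each coordinate update cost O(n), so one full pass over the features costs O(nd), in the spirit of the coordinate descent method for the l2-loss linear SVM cited in reference 9 below.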
Source: Pattern Recognition and Artificial Intelligence (模式识别与人工智能), EI, CSCD, Peking University Core Journal, 2013, No.5, pp.474-480 (7 pages)
Fund: Supported by the National Key Basic Research Program of China (No.2012CB720500)
Keywords: l2-Norm Regularization, Least Squares Support Vector Machine, Coordinate Descent, Large-Scale Datasets

References (16)

  • 1 Boser B E, Guyon I M, Vapnik V N. A Training Algorithm for Optimal Margin Classifiers // Proc of the 5th Annual ACM Conference on Computational Learning Theory. Pittsburgh, USA, 1992: 144-152.
  • 2 Suykens J A K, Van Gestel T, de Brabanter J, et al. Least Squares Support Vector Machines. Singapore: World Scientific, 2002.
  • 3 Suykens J A K, Vandewalle J. Least Squares Support Vector Machine Classifiers. Neural Processing Letters, 1999, 9(3): 293-300.
  • 4 Lazaro J L, Dorronsoro J R. Least 1-Norm SVMs: A New SVM Variant between Standard and LS-SVMs // Proc of the 18th European Symposium on Artificial Neural Networks. Bruges, Belgium, 2010: 135-140.
  • 5 Suykens J A K, de Brabanter J, Lukas L, et al. Weighted Least Squares Support Vector Machines: Robustness and Sparse Approximation. Neurocomputing, 2002, 48(1): 85-105.
  • 6 Valyon J, Horváth G. A Weighted Generalized LS-SVM. Periodica Polytechnica Electrical Engineering, 2003, 47(3/4): 229-251.
  • 7 Leski J M. Iteratively Reweighted Least Squares Classifier and Its l2- and l1-Regularized Kernel Versions. Bulletin of the Polish Academy of Sciences: Technical Sciences, 2010, 58(1): 171-182.
  • 8 Liu Jingli, Li Jianping, Xu Weixuan, et al. A Weighted Lq Adaptive Least Squares Support Vector Machine Classifiers - Robust and Sparse Approximation. Expert Systems with Applications, 2011, 38(3): 2253-2259.
  • 9 Chang K W, Hsieh C J, Lin C J. Coordinate Descent Method for Large-Scale l2-Loss Linear SVM. Journal of Machine Learning Research, 2008, 9: 1369-1398.
  • 10 Fan R E, Chang K W, Hsieh C J, et al. LIBLINEAR: A Library for Large Linear Classification. Journal of Machine Learning Research, 2008, 9: 1871-1874.
