
Finite Newton algorithm for Lagrangian support vector regression (Cited by: 3)
Abstract: Lagrangian Support Vector Regression (SVR) is an effective fast regression algorithm. Its solution requires inverting a matrix whose order equals the number of samples plus one, and the iteration needs many steps to converge from a starting point. This paper proposes a finite Armijo-Newton algorithm for the optimization problem of Lagrangian SVR: a solution is obtained by solving a finite number of systems of linear equations instead of a quadratic programming problem, and the method enjoys global convergence and finite-step termination. Experimental results on several benchmark datasets show that the proposed algorithm is fast and generalizes well.
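The abstract describes the algorithm only at a high level. The following is a minimal illustrative sketch, not the authors' exact Lagrangian (dual) formulation: it applies a generalized Newton iteration with an Armijo line search to a squared ε-insensitive loss formulation of linear SVR in the primal. All names and parameters here (X, y, C, eps, the ridge term) are assumptions chosen for illustration; the paper's method instead works with a matrix of order equal to the number of samples plus one.

```python
import numpy as np

def finite_newton_svr(X, y, C=1.0, eps=0.1, tol=1e-6, max_iter=50):
    """Illustrative Newton-Armijo solver for squared eps-insensitive SVR (linear kernel)."""
    m, n = X.shape
    A = np.hstack([X, np.ones((m, 1))])   # augment with bias column: f(x) = [x, 1] @ u
    u = np.zeros(n + 1)

    def residual(u):
        r = A @ u - y
        # epsilon-insensitive residual, sign preserved; zero inside the tube
        return np.sign(r) * np.maximum(np.abs(r) - eps, 0.0)

    def objective(u):
        return 0.5 * u[:n] @ u[:n] + 0.5 * C * np.sum(residual(u) ** 2)

    def gradient(u):
        g = np.concatenate([u[:n], [0.0]])        # bias is not regularized
        return g + C * A.T @ residual(u)

    for _ in range(max_iter):
        g = gradient(u)
        if np.linalg.norm(g) < tol:
            break
        # Generalized Hessian: only samples outside the eps-tube contribute
        active = np.abs(A @ u - y) > eps
        E = np.diag(np.concatenate([np.ones(n), [0.0]]))
        H = E + C * A[active].T @ A[active] + 1e-8 * np.eye(n + 1)
        d = np.linalg.solve(H, -g)                # one linear system per iteration

        # Armijo line search: halve the step until sufficient decrease holds
        t, f0 = 1.0, objective(u)
        while objective(u + t * d) > f0 + 0.25 * t * (g @ d) and t > 1e-10:
            t *= 0.5
        u += t * d

    return u[:n], u[n]                            # weights, bias
```

Each iteration solves one linear system and then shrinks the step size until the Armijo sufficient-decrease condition holds; this combination of a generalized Newton direction and an Armijo step is the mechanism behind the global convergence and finite-termination properties claimed in the abstract.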
Source: Journal of Computer Applications (《计算机应用》, CSCD, Peking University Core Journal), 2012, No. 9, pp. 2504-2507 (4 pages)
Funding: National Natural Science Foundation of China (60775011)
Keywords: Support Vector Regression (SVR); Lagrangian support vector machine; finite Newton algorithm; iterative algorithm

References (19)

  • 1 CRISTIANINI N, SHAWE-TAYLOR J. Kernel methods for pattern analysis [M]. Cambridge: Cambridge University Press, 2004.
  • 2 BURGES C J C. A tutorial on support vector machines for pattern recognition [J]. Data Mining and Knowledge Discovery, 1998, 2(2): 121-167.
  • 3 SCHOLKOPF B, SMOLA A J. Learning with kernels [M]. Cambridge: MIT Press, 2002.
  • 4 VAPNIK V. The nature of statistical learning theory [M]. Berlin: Springer-Verlag, 1995.
  • 5 VAPNIK V. An overview of statistical learning theory [J]. IEEE Transactions on Neural Networks, 1999, 10(5): 988-999.
  • 6 CORTES C, VAPNIK V. Support-vector networks [J]. Machine Learning, 1995, 20(3): 273-297.
  • 7 CRISTIANINI N, SHAWE-TAYLOR J. An introduction to support vector machines and other kernel-based learning methods [M]. Beijing: China Machine Press, 2005.
  • 8 MULLER K R, MIKA S, RATSCH G, et al. An introduction to kernel-based learning algorithms [J]. IEEE Transactions on Neural Networks, 2001, 12(2): 181-201.
  • 9 SUYKENS J A K, VAN GESTEL T, DE BRABANTER J, et al. Least squares support vector machines [M]. [S.l.]: World Scientific Press, 2002.
  • 10 PENG X J. TSVR: an efficient twin support vector machine for regression [J]. Neural Networks, 2010, 23(3): 365-372.

Co-cited References (49)

  • 1 WANG Xingling, LI Zhanbin. Determination of SVM kernel function parameters based on grid search [J]. 中国海洋大学学报(自然科学版), 2005, 35(5): 859-862. (Cited by: 123)
  • 2 PENG Lifang, MENG Zhiqing, JIANG Hua, TIAN Mi. Application of time-series-based support vector machines to stock prediction [J]. 计算技术与自动化, 2006, 25(3): 88-91. (Cited by: 31)
  • 3 ZHONG Dongting, ZHANG Yue. An improved BP neural network method for forecasting tobacco sales volume [J]. 工业技术经济, 2007, 26(9): 115-118. (Cited by: 11)
  • 4 BAUMBERG A. Reliable feature matching across widely separated views [C]// Proc. Int. Conf. Computer Vision and Pattern Recognition. Hilton Head, USA: IEEE Computer Society, 2000: 774-781.
  • 5 LINDEBERG T. Feature detection with automatic scale selection [J]. International Journal of Computer Vision, 1998, 30(2): 79-116.
  • 6 MIKOLAJCZYK K, SCHMID C. Indexing based on scale invariant interest points [C]// Proc. Int. Conf. Computer Vision. Vancouver, BC, Canada: IEEE Computer Society, 2001: 525-531.
  • 7 MIKOLAJCZYK K, SCHMID C. An affine invariant interest point detector [C]// Proc. Seventh European Conf. Computer Vision. Copenhagen, Denmark: Springer-Verlag Berlin Heidelberg, 2002: 128-142.
  • 8 MATAS J, CHUM O, URBAN M, et al. Robust wide baseline stereo from maximally stable extremal regions [C]// Proc. 13th British Machine Vision Conference. Cardiff University, UK: British Machine Vision Association, 2002: 384-393.
  • 9 MIKOLAJCZYK K, SCHMID C. A performance evaluation of local descriptors [J]. IEEE Trans Pattern Anal Mach Intell, 2005, 27(10): 1615-1629.
  • 10 HARTLEY R, ZISSERMAN A. Multiple view geometry in computer vision [M]. Cambridge, UK: Cambridge University Press, 2000: 25-27.

Citing Articles (3)

Secondary Citing Articles (11)
