Abstract
Support vector machines (SVMs) are a relatively new method developed on the basis of statistical learning theory; training an SVM is, in essence, a quadratic programming problem. This paper first briefly reviews the basic principles of SVMs and then surveys the state of research on SVM training algorithms in China and abroad, focusing on reduction algorithms and on algorithms with linear convergence properties. The performance of these algorithms is compared, and extensions of SVMs are briefly introduced. Finally, open problems and future research directions in this area are discussed.
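For context on the abstract's claim that SVM training reduces to a quadratic program, the standard soft-margin dual formulation (a textbook statement, not quoted from the paper itself) is:

```latex
\max_{\alpha}\;\; \sum_{i=1}^{n}\alpha_i
\;-\; \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}
\alpha_i \alpha_j \, y_i y_j \, K(x_i, x_j)
\quad\text{s.t.}\quad
\sum_{i=1}^{n}\alpha_i y_i = 0,
\qquad 0 \le \alpha_i \le C,\;\; i = 1,\dots,n.
```

Here $K$ is the kernel function, $y_i \in \{-1,+1\}$ the labels, and $C$ the regularization parameter; the reduction and decomposition algorithms surveyed in the paper are strategies for solving this QP efficiently when $n$ is large.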
Source
CAAI Transactions on Intelligent Systems (《智能系统学报》)
2008, No. 6, pp. 467-475 (9 pages)
Funding
Supported by the National Natural Science Foundation of China (60474069)
Keywords
statistical learning theory
support vector machine
training algorithms