Abstract
An incremental learning algorithm using multiple support vector machines (SVMs) is proposed. The traditional SVM has no incremental learning capability, and the commonly used incremental learning methods each have their own strengths and weaknesses. Based on the fixed-partition and exceeding-margin techniques, an incremental learning algorithm using multiple SVMs is therefore presented. The algorithm was applied to the standard dataset BUPA and to the dataset OUTTRAIN generated by NDC. The experiments show that incremental learning with a single SVM, whether based on the exceeding-margin or the fixed-partition technique, yields lower accuracy than the proposed multi-SVM incremental learning algorithm.
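The scheme described in the abstract can be illustrated with a minimal sketch: the training data is split into fixed partitions, one SVM is trained per increment with the previous increment's support vectors carried forward, and the resulting multiple SVMs vote at prediction time. This is an illustrative reconstruction using scikit-learn, not the paper's exact formulation; the function names, the carry-forward rule, and the majority-vote combiner are assumptions.

```python
# Hedged sketch of multi-SVM incremental learning over fixed partitions.
# Assumes scikit-learn; the partitioning and voting details are illustrative,
# not taken from the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

def incremental_svm_ensemble(X, y, n_parts=3):
    """Train one SVM per fixed partition, carrying support vectors forward."""
    models = []
    carry_X = np.empty((0, X.shape[1]))
    carry_y = np.empty(0, dtype=y.dtype)
    for Xp, yp in zip(np.array_split(X, n_parts), np.array_split(y, n_parts)):
        # each increment sees its own partition plus the previous
        # model's support vectors (the retained "memory" of old data)
        Xt = np.vstack([carry_X, Xp])
        yt = np.concatenate([carry_y, yp])
        clf = SVC(kernel="linear").fit(Xt, yt)
        models.append(clf)
        carry_X = Xt[clf.support_]
        carry_y = yt[clf.support_]
    return models

def vote(models, X):
    """Majority vote over the ensemble of incrementally trained SVMs."""
    preds = np.stack([m.predict(X) for m in models])
    return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, preds)

# Synthetic stand-in for the BUPA/OUTTRAIN data used in the paper.
X, y = make_classification(n_samples=300, n_features=4, random_state=0)
models = incremental_svm_ensemble(X, y)
acc = (vote(models, X) == y).mean()
```

Keeping all partition models and voting, rather than retaining only the latest SVM, is what distinguishes the multi-SVM variant from single-SVM incremental learning with either technique alone.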
Source
Journal of Northern Jiaotong University (《北方交通大学学报》)
Indexed in CSCD; PKU Core Journal
2003, No. 5, pp. 34-37 (4 pages)
Funding
National Key Technologies R&D Program of China (2002BA407B)
Keywords
support vector machine (SVM)
incremental learning
expected risk
fixed partition
exceeding margin