Abstract
To address regression and classification on large datasets, the least squares support vector machine (LS-SVM) is improved. Based on linear analysis in a reproducing kernel Hilbert space (RKHS), the samples are mapped into the reproducing space, where they span a linear subspace, and a basis of this subspace is computed. Since every other element of the subspace can be expressed linearly in this basis, the dimension of the matrix to be solved is reduced, and the support vector machine is trained by solving a comparatively small system of linear equations. Regression and classification simulations on fairly large datasets were carried out with this method and compared with the ordinary LS-SVM. The results show that, for regression and classification of complex nonlinear functions, the method not only yields sparse solutions but also runs about 20% faster than the ordinary LS-SVM.
A sparse least squares support vector machine (SLS-SVM) was proposed to solve regression and classification problems on large datasets. The samples were mapped into a reproducing kernel Hilbert space (RKHS), where they spanned a subspace. A basis of this subspace was then found, in terms of which all the samples could be represented linearly. The SLS-SVM was obtained by solving a small system of linear equations. Two numerical examples illustrate that the approach can fit nonlinear models and classify complex samples on large datasets. Compared with the classical least squares support vector machine, this method finds sparse solutions without any pruning, and computes much faster because the final result is obtained by solving a comparatively small-scale linear system.
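The procedure the abstract outlines (map samples into the RKHS, find a basis of the spanned subspace, then train on the reduced system) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the basis is chosen by a greedy linear-independence test on the kernel matrix (the paper's exact selection rule may differ), and the reduced model is fitted by a ridge-regularized least squares solve; the kernel width `sigma`, regularization `gamma`, and tolerance `tol` are assumed parameters.

```python
import numpy as np

def rbf(X, Z, sigma=1.0):
    # Gaussian radial basis function kernel matrix between sample sets X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def select_basis(X, sigma=1.0, tol=1e-3):
    """Greedily keep samples whose image phi(x_i) in the RKHS is not
    (near-)linearly dependent on the images of the samples already kept."""
    idx = [0]
    for i in range(1, len(X)):
        Kbb = rbf(X[idx], X[idx], sigma)
        kb = rbf(X[idx], X[i:i + 1], sigma).ravel()
        # squared residual of projecting phi(x_i) onto span{phi(x_j): j in idx}
        resid = rbf(X[i:i + 1], X[i:i + 1], sigma)[0, 0] - kb @ np.linalg.solve(Kbb, kb)
        if resid > tol:
            idx.append(i)
    return idx

def sparse_lssvm_fit(X, y, sigma=1.0, gamma=1e4, tol=1e-3):
    idx = select_basis(X, sigma, tol)
    Knb = rbf(X, X[idx], sigma)                  # n x m kernel design matrix, m << n
    A = np.hstack([Knb, np.ones((len(X), 1))])   # append a bias column
    reg = np.eye(A.shape[1]) / gamma             # ridge term from the LS-SVM objective
    reg[-1, -1] = 0.0                            # do not penalize the bias
    # the whole training step is one small (m+1) x (m+1) linear solve
    w = np.linalg.solve(A.T @ A + reg, A.T @ y)
    return idx, w[:-1], w[-1]

# usage: regress y = sin(x) on 200 points; the basis stays much smaller than 200
X = np.linspace(0.0, 2.0 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()
idx, beta, b = sparse_lssvm_fit(X, y, sigma=1.0, gamma=1e4)
yhat = rbf(X, X[idx]) @ beta + b
```

Because only the basis points carry coefficients, the solution is sparse by construction, and the final solve involves an (m+1)-dimensional system rather than the full (n+1)-dimensional LS-SVM system.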
Source
《浙江大学学报(工学版)》
EI
CAS
CSCD
Peking University Core Journals (北大核心)
2007, No. 2, pp. 245-248 (4 pages)
Journal of Zhejiang University:Engineering Science
Funding
Supported by the Key Science and Technology Program of Zhejiang Province (2005C21087)
Keywords
least squares support vector machine
reproducing kernel Hilbert space
radial basis function