Abstract
The standard support vector machine (SVM) formulation is modified by using equality constraints in a ridge-regression form instead of inequality constraints, yielding the least squares support vector machine (LSSVM); the solution is then obtained by solving a set of linear equations rather than a quadratic programming problem. Kernel principal component analysis (KPCA) is applied to the LSSVM for feature extraction: KPCA computes principal components in the high-dimensional feature space, reducing the dimensionality of the samples, and regression is then performed with the LSSVM. Simulation results show that the proposed method is effective and superior.
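The pipeline the abstract describes (KPCA for feature extraction, then LSSVM regression solved as a linear system) can be sketched as follows. This is a minimal illustration, not the authors' code: the RBF kernel, the parameter values (`sigma`, `gamma`, `n_components`), and all function names are assumptions for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between row-sample matrices A and B (assumed kernel choice).
    d2 = (np.sum(A ** 2, axis=1)[:, None]
          + np.sum(B ** 2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kpca_fit(X, n_components, sigma=1.0):
    # KPCA: eigendecompose the double-centered kernel matrix of the training set.
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    J = np.ones((n, n)) / n                      # centering helper
    Kc = K - J @ K - K @ J + J @ K @ J           # centered kernel in feature space
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]  # leading eigenpairs
    alphas = vecs[:, idx] / np.sqrt(vals[idx])   # scale eigenvectors for unit-norm projections
    return {"X": X, "K": K, "alphas": alphas, "sigma": sigma}

def kpca_transform(model, Xnew):
    # Project samples onto the kernel principal components (dimensionality reduction step).
    X, K, alphas, sigma = model["X"], model["K"], model["alphas"], model["sigma"]
    n = X.shape[0]
    Kt = rbf_kernel(Xnew, X, sigma)
    Jn = np.ones((n, n)) / n
    Jm = np.ones((Xnew.shape[0], n)) / n
    Ktc = Kt - Jm @ K - Kt @ Jn + Jm @ K @ Jn    # center consistently with training
    return Ktc @ alphas

def lssvm_fit(Z, y, gamma=100.0, sigma=1.0):
    # LSSVM regression: equality constraints turn the QP into one (n+1)x(n+1) linear system.
    n = Z.shape[0]
    Omega = rbf_kernel(Z, Z, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = Omega + np.eye(n) / gamma        # ridge term from the least-squares loss
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)                # linear solve instead of a QP
    return {"b": sol[0], "alpha": sol[1:], "Z": Z, "sigma": sigma}

def lssvm_predict(model, Znew):
    return rbf_kernel(Znew, model["Z"], model["sigma"]) @ model["alpha"] + model["b"]
```

Usage follows the order in the abstract: fit KPCA on the training inputs, project them to the reduced feature space, then fit the LSSVM on the projected samples and predict with `lssvm_predict`.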
Source
《控制与决策》 (Control and Decision)
Indexed in: EI, CSCD, Peking University Core Journals (北大核心)
2006, No. 9, pp. 1073-1076 (4 pages)
Funding
Key Technologies R&D Program of the Ministry of Science and Technology of China (2003EG113016)
Beijing Municipal Education Commission Key Discipline Co-construction Fund
Keywords
Kernel principal component analysis
Least squares support vector machines
Principal component
Feature extraction