Abstract
A simpler formulation of principal component analysis (PCA) and its kernel version is presented. A constrained maximization problem with a regularization term is posed in the primal weight space, and its dual problem is solved by means of the kernel trick. The formulation parallels that of the least squares support vector machine (LS-SVM) classifier: following the usual SVM methodology, the data are mapped from the input space to a high-dimensional feature space and the kernel trick is applied, so that both linear and nonlinear PCA are interpreted through a primal-dual constrained maximization. The nonlinear version of the formulation yields a solution equivalent to kernel PCA. The advantage is that the dual problem is well suited to high-dimensional input spaces, while the primal problem can be solved more effectively when N is large.
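For concreteness, a minimal sketch of the kind of primal-dual formulation the abstract describes, following the standard LS-SVM treatment of PCA; the symbols used here (regularization constant $\gamma$, feature map $\varphi$, centered kernel matrix $\Omega_c$) are assumed for illustration and may not match the paper's notation:

$$
\max_{w,\,e}\; J(w,e)=\frac{\gamma}{2}\sum_{i=1}^{N} e_i^{2}-\frac{1}{2}\,w^{\top}w
\quad\text{s.t.}\quad e_i=w^{\top}\bigl(\varphi(x_i)-\hat{\mu}_{\varphi}\bigr),\; i=1,\dots,N.
$$

Eliminating $w$ from the optimality conditions and applying the kernel trick $K(x_i,x_j)=\varphi(x_i)^{\top}\varphi(x_j)$ reduces the dual to an eigenvalue problem $\Omega_c\,\alpha=\lambda\,\alpha$ on the $N\times N$ centered kernel matrix, which recovers kernel PCA; a linear kernel gives back ordinary PCA. This also makes the stated trade-off concrete: the dual works with an $N\times N$ matrix and is attractive when the feature dimension is high, while the primal works with the weight vector $w$ and scales better when N is large.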
Source
《华中科技大学学报(自然科学版)》 (Journal of Huazhong University of Science and Technology (Natural Science Edition))
Indexed in: EI, CAS, CSCD, Peking University Core Journals (北大核心)
2005, No. 1, pp. 25-27 (3 pages)
Funding
国家"十五"重大科技专项基金资助项目 (2 0 0 1BA10 2A0 6 11) .
Keywords
support vector machine (SVM)
principal component analysis (PCA)
kernel trick
least squares support vector machine (LS-SVM)