Abstract
To address the problems that existing feature selection algorithms fail to consider simultaneously the correlation among features and that between features and class labels, and that they suffer from heavy computation and a narrow scope of application, a feature selection algorithm combining within-class variance with a correlation measure is proposed, starting from the minimum-mean-square-error training criterion for classification and drawing on the idea of linear discriminant analysis (LDA). Using the kernel method, the algorithm is further extended to feature selection for nonlinear classification problems. The proposed algorithm not only accounts for the correlation among sample features and that between features and class labels at the same time, but also minimizes the within-class variance, effectively improving classifier performance. Simulation experiments show that the algorithm is well suited to feature selection on datasets with many, strongly correlated features, and that the feature subsets it selects significantly improve classification accuracy.
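The core idea described in the abstract, favoring features that correlate strongly with the class label while penalizing large within-class variance, can be sketched as follows. This is an illustrative approximation, not the paper's exact criterion: the weighted-difference score, the `alpha` balance parameter, and the function names are assumptions, and the kernel extension for the nonlinear case is omitted.

```python
import numpy as np

def score_features(X, y, alpha=0.5):
    """Score each feature: high absolute correlation with the class
    label is rewarded, large pooled within-class variance is penalized.
    `alpha` (an assumed balancing weight, not from the paper) trades
    the two terms off against each other."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n_samples, n_features = X.shape
    # Pooled within-class variance per feature, weighted by class size
    wvar = np.zeros(n_features)
    for c in np.unique(y):
        Xc = X[y == c]
        wvar += (len(Xc) / n_samples) * Xc.var(axis=0)
    # Absolute Pearson correlation between each feature and the label
    corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_features)])
    # Normalize the variance term so the two terms are comparable in scale
    wvar_n = wvar / (wvar.max() + 1e-12)
    return alpha * corr - (1.0 - alpha) * wvar_n

def select_top_k(X, y, k, alpha=0.5):
    """Return the indices of the k best-scoring features."""
    scores = score_features(X, y, alpha)
    return np.argsort(scores)[::-1][:k]
```

On a toy two-class dataset where one feature tracks the label closely (low within-class variance, high correlation) and a second feature is pure noise, `select_top_k(X, y, 1)` picks the informative feature.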
Source
《哈尔滨工业大学学报》
EI
CAS
CSCD
Peking University Core Journal (北大核心)
2011, No. 3, pp. 132-136 (5 pages)
Journal of Harbin Institute of Technology
Funding
Supported by the Open Project Research Fund of the State Key Laboratory of Advanced Welding Production Technology
Jiangsu Province High-Technology Research Project (BG2007013)
Keywords
within-class variance
correlation measure
feature selection
LDA
classification