Abstract
Feature selection improves the performance of learning algorithms by removing irrelevant and redundant features. Motivated by the strong performance of evolutionary algorithms on optimization problems, this paper proposes a new feature selection method, FSSAC. A new initialization strategy and evaluation function allow SAC to treat feature selection as a discrete space search problem, with the classification accuracy of candidate feature subsets guiding SAC's sampling phase. In the experiments, FSSAC was combined with the SVM, J48 and KNN classifiers, validated on UCI machine learning datasets, and compared with FSFOA, HGAFS, PSO and other algorithms. The results show that FSSAC improves the classification accuracy of the classifiers and generalizes well. In addition, FSSAC was compared with the other methods in terms of feature space dimensionality reduction.
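The core wrapper idea in the abstract, using the accuracy of a candidate feature subset as the fitness that guides a sampling-based search, can be illustrated with a minimal sketch. This is not the authors' FSSAC implementation: the dataset (scikit-learn's copy of the UCI wine data), the KNN classifier, and the plain random-sampling loop are illustrative assumptions standing in for SAC's own sampling phase.

```python
# Minimal sketch of wrapper-style feature selection: subset accuracy as fitness.
# Dataset, classifier, and the random-sampling search are assumptions, not FSSAC.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier


def subset_accuracy(X, y, mask):
    """Cross-validated accuracy of a classifier trained on the selected features."""
    if not mask.any():                      # an empty subset carries no information
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X[:, mask], y, cv=5).mean()


def random_sampling_search(X, y, n_iter=50, seed=0):
    """Keep the best feature subset found by repeated random sampling."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    best_mask = np.ones(n_features, dtype=bool)          # start from all features
    best_acc = subset_accuracy(X, y, best_mask)
    for _ in range(n_iter):
        mask = rng.random(n_features) < 0.5               # sample a candidate subset
        acc = subset_accuracy(X, y, mask)                 # fitness = subset accuracy
        if acc > best_acc:
            best_mask, best_acc = mask, acc
    return best_mask, best_acc


if __name__ == "__main__":
    data = load_wine()
    mask, acc = random_sampling_search(data.data, data.target)
    print(f"selected {mask.sum()}/{len(mask)} features, CV accuracy = {acc:.3f}")
```

In FSSAC the sampling step would be driven by SAC's own strategy rather than uniform random masks, but the fitness signal (accuracy of the subset) and the dimensionality-reduction measure (size of the selected mask) correspond to what the abstract describes.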
Source
Computer Science (《计算机科学》), CSCD, PKU Core Journal (北大核心)
2018, No. 2, pp. 63-68 (6 pages)
Funding
Supported by the Science and Technology Development Program of Jilin Province (20140101200JC)