
Improved Shuffled Binary Grasshopper Optimization Feature Selection Algorithm (Cited by: 3)

Abstract: Feature selection aims to choose an optimal or near-optimal feature subset from the original feature set of a dataset, speeding up classification while improving classification accuracy. This paper proposes an improved shuffled binary grasshopper optimization feature selection algorithm. By introducing a binary transformation strategy in which the step size guides individual position changes, the blindness of the binary conversion is reduced and the search performance of the algorithm in the solution space is improved. By introducing the shuffled complex evolution method, the grasshopper population is divided into subgroups that evolve independently, which improves the diversity of the algorithm and reduces the probability of premature convergence. The improved algorithm is used to select features on several UCI datasets, and a K-NN (K-nearest neighbor) classifier is used to evaluate the classification performance of the selected feature subsets. Experimental results show that, compared with the basic binary grasshopper optimization algorithm, the binary particle swarm optimization algorithm, and the binary grey wolf optimization algorithm, the improved algorithm has better search performance, better convergence, and stronger robustness, and it obtains better feature subsets and better classification results.
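The abstract describes the method only at a high level, so the sketch below is a minimal, illustrative Python rendering of a binary wrapper feature selector in the same spirit: a K-NN cross-validation fitness, a step-size-guided sigmoid binarization, and an SCE-style shuffle of the population into independently evolving subgroups. The position-update rule, the sigmoid form, the penalty weight alpha, and all parameter values are assumptions made for illustration, not the paper's exact formulation.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def fitness(mask, X, y, alpha=0.99):
    # K-NN wrapper fitness: weighted error rate plus a feature-ratio penalty (assumed form).
    if not mask.any():                                    # an empty subset is invalid
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5), X[:, mask], y, cv=3).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.sum() / mask.size

def binarize(step):
    # Step-size-guided transfer: a larger step magnitude gives a higher flip probability.
    prob = 1.0 / (1.0 + np.exp(-10.0 * (np.abs(step) - 0.5)))
    return rng.random(step.shape) < prob

def shuffled_binary_fs(X, y, pop_size=20, n_groups=4, iters=20):
    dim = X.shape[1]
    pop = rng.random((pop_size, dim)) < 0.5               # random binary positions
    fit = np.array([fitness(ind, X, y) for ind in pop])
    best = pop[fit.argmin()].copy()
    for _ in range(iters):
        # SCE-style shuffle: rank individuals, deal them round-robin into subgroups.
        order = fit.argsort()
        groups = [order[g::n_groups] for g in range(n_groups)]
        for g in groups:                                   # each subgroup evolves independently
            g_best = pop[g[fit[g].argmin()]].copy()
            for i in g:
                # Move toward the subgroup best and the global best (assumed update rule).
                step = (rng.random(dim) * (g_best != pop[i]) +
                        rng.random(dim) * (best != pop[i]))
                cand = np.where(binarize(step), ~pop[i], pop[i])
                f = fitness(cand, X, y)
                if f < fit[i]:                             # greedy replacement
                    pop[i], fit[i] = cand, f
        best = pop[fit.argmin()].copy()                    # reshuffle happens on the next pass
    return best, fit.min()

if __name__ == "__main__":
    data = load_breast_cancer()                            # stand-in for the UCI data sets
    mask, best_fit = shuffled_binary_fs(data.data, data.target)
    print("selected", int(mask.sum()), "of", mask.size, "features, fitness", round(best_fit, 4))

Ranking the population and dealing it round-robin into subgroups, then reshuffling after each outer pass, is the standard shuffled-complex-evolution partitioning; this is the mechanism the abstract credits for the extra diversity and the lower risk of premature convergence.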
Authors: ZHAO Zeyuan (赵泽渊), DAI Yongqiang (代永强) (College of Information Science and Technology, Gansu Agricultural University, Lanzhou 730070, China)
Source: Journal of Frontiers of Computer Science and Technology (《计算机科学与探索》), CSCD, Peking University Core Journal, 2021, Issue 7, pp. 1339-1349 (11 pages)
Funding: Gansu Agricultural University Young Supervisor Fund (GAU-QDFC-2019-02); Gansu Province Higher Education Innovation Capability Improvement Project (2019A-056); Gansu Agricultural University Special Fund for Discipline Construction (GAU-XKJS-2018-253); National Natural Science Foundation of China (61063028, 61751313)
Keywords: binary; grasshopper optimization algorithm; shuffled complex evolution; feature selection; classification; K-nearest neighbor (K-NN) algorithm

相关作者

内容加载中请稍等...

相关机构

内容加载中请稍等...

相关主题

内容加载中请稍等...

浏览历史

内容加载中请稍等...
;
使用帮助 返回顶部