
Improved incremental extreme learning machine based on multi-learning clonal selection algorithm
(基于多层学习克隆选择的改进式增量型超限学习机算法; cited by 1)
Abstract: In an incremental extreme learning machine (I-ELM), a large number of redundant hidden nodes can lower learning efficiency and complicate the network structure. To address this, we propose an improved I-ELM with kernel (I-ELMK) based on a multi-learning clonal selection algorithm (MLCSA). The MLCSA combines a Baldwinian learning operator, which reshapes the search range using information carried by the antibodies, with a Lamarckian learning operator, which strengthens the exploitation of individual information, thereby improving the search capability of the clonal selection algorithm (CSA). The improved algorithm effectively limits the number of hidden-layer nodes, yielding a more compact network architecture and higher accuracy. Simulation results show that the proposed MLCSI-ELMK simplifies the network structure while maintaining good generalization, strong learning ability, and accurate online and offline prediction.
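The I-ELM that the abstract builds on grows the hidden layer one randomly generated node at a time, fitting each new node's output weight to the current residual. The sketch below illustrates that incremental scheme only; it is a generic reconstruction (function names, sigmoid activation, and stopping rule are assumptions), not the kernelized or MLCSA-optimized variant proposed in the paper.

```python
import numpy as np

def i_elm(X, y, max_nodes=50, tol=1e-3, rng=None):
    """Incremental ELM sketch: add random sigmoid hidden nodes one at a
    time, choosing each output weight by least squares on the residual."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    e = y.astype(float).copy()                   # current residual error
    nodes = []                                   # (w, b, beta) per hidden node
    for _ in range(max_nodes):
        w = rng.standard_normal(d)               # random input weights
        b = rng.standard_normal()                # random bias
        h = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid activations
        beta = (e @ h) / (h @ h)                 # optimal output weight
        e = e - beta * h                         # residual shrinks each step
        nodes.append((w, b, beta))
        if np.linalg.norm(e) < tol:
            break
    return nodes, e

def predict(nodes, X):
    """Sum the contributions of all hidden nodes added so far."""
    out = np.zeros(X.shape[0])
    for w, b, beta in nodes:
        out += beta / (1.0 + np.exp(-(X @ w + b)))
    return out
```

Because each step projects the residual onto the new node's activations, the residual norm never increases; the paper's point is that purely random nodes often contribute little, which is why it prunes or selects them with the MLCSA.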
Source: Control Theory & Applications (《控制理论与应用》), EI / CAS / CSCD / Peking University Core indexed, 2016, No. 3: 368-379 (12 pages).
Funding: National Natural Science Foundation of China (61102124); Science and Technology Plan of Liaoning Province (JH2/101).
Keywords: clonal selection algorithm; Baldwinian learning; Lamarckian learning; neural networks; incremental extreme learning machine; soft computing
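The distinction between the Baldwinian and Lamarckian operators named in the keywords is whether the result of local learning is written back into the antibody: Baldwinian learning lets the learned fitness guide selection while the genotype keeps the unlearned clone, whereas Lamarckian learning replaces the genotype with the learned point. The following is a generic minimization sketch of a clonal selection loop mixing both operators; all parameter names and rates are illustrative assumptions, not the paper's MLCSA configuration.

```python
import numpy as np

def local_search(x, f, step=0.05, rng=None):
    """One-step hill climb: keep a small random perturbation if it helps."""
    rng = np.random.default_rng(rng)
    cand = x + step * rng.standard_normal(x.shape)
    return cand if f(cand) < f(x) else x

def mlcsa_minimize(f, dim, pop=20, clones=5, gens=100,
                   baldwin_rate=0.5, seed=0):
    """Clonal selection with mixed Baldwinian / Lamarckian learning."""
    rng = np.random.default_rng(seed)
    P = rng.uniform(-5.0, 5.0, (pop, dim))      # antibody population
    for _ in range(gens):
        new_P = []
        for x in P:
            best, best_fit = x, f(x)
            for _ in range(clones):
                c = x + 0.3 * rng.standard_normal(dim)   # hypermutation
                learned = local_search(c, f, rng=rng)
                if rng.random() < baldwin_rate:
                    # Baldwinian: select on the learned fitness,
                    # but the genotype keeps the unlearned clone
                    fit, keep = f(learned), c
                else:
                    # Lamarckian: the learned point replaces the genotype
                    fit, keep = f(learned), learned
                if fit < best_fit:
                    best, best_fit = keep, fit
            new_P.append(best)
        P = np.array(new_P)
    return min(P, key=f)
```

In the paper, an optimizer of this kind is used to select and tune the I-ELM hidden nodes rather than to minimize a standalone test function.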
