
A PSO-based approach for constructing neural network ensembles  Cited by: 18

PSO-based approach for neural network ensembles
Abstract: To select ensemble members sensibly and keep a large diversity among them, thereby improving the prediction accuracy of models built from artificial neural network (ANN) ensembles, a new approach for constructing neural network ensembles is proposed. A number of neural networks are first trained independently. A discrete particle swarm optimization (PSO) algorithm then represents every candidate ensemble as a particle in a multi-dimensional space whose coordinates take the value 0 or 1, each dimension indicating whether the corresponding network is included in the ensemble. The estimated prediction error of the ensemble, expressed through the correlation between the errors of its member networks, serves as the fitness function during optimization, so the search favours subsets of networks with high diversity. Experiments on regression problems over eight typical data sets show that the ensembles constructed by this method generally use fewer member networks while achieving better prediction accuracy than traditional methods such as Bagging.
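The abstract describes the selection step: each particle is a 0/1 vector over the pool of trained networks, and the fitness is an error estimate built from the correlation between member errors. The following is a minimal illustrative sketch of that idea in Python, not the authors' code: the names predictions, targets, and the PSO parameters w, c1, c2, as well as the simple-average ensemble, are assumptions made for the example.

```python
# Sketch (assumed interface): binary PSO selection of ensemble members.
# `predictions` is an (n_networks, n_samples) array of each trained
# network's outputs on a validation set; `targets` holds the true values.
import numpy as np

def ensemble_error(mask, predictions, targets):
    """Error estimate for a simple-average ensemble from the correlation
    matrix of member errors: E = (1/M^2) * sum_ij C_ij,
    with C_ij = mean((f_i - y) * (f_j - y))."""
    members = np.flatnonzero(mask)
    if members.size == 0:
        return np.inf                                   # empty ensemble is invalid
    residuals = predictions[members] - targets          # (M, n_samples)
    C = residuals @ residuals.T / targets.size          # error correlation matrix
    return C.sum() / members.size ** 2

def binary_pso_select(predictions, targets, n_particles=30, n_iter=100,
                      w=0.8, c1=2.0, c2=2.0, seed=0):
    """Return a 0/1 mask over the network pool, minimizing ensemble_error."""
    rng = np.random.default_rng(seed)
    n_nets = predictions.shape[0]
    x = rng.integers(0, 2, size=(n_particles, n_nets))   # 0/1 positions
    v = rng.uniform(-1, 1, size=(n_particles, n_nets))   # velocities
    pbest = x.copy()
    pbest_f = np.array([ensemble_error(p, predictions, targets) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        # Discrete PSO rule: bit is 1 with probability sigmoid(velocity).
        x = (rng.random(x.shape) < 1.0 / (1.0 + np.exp(-v))).astype(int)
        f = np.array([ensemble_error(p, predictions, targets) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest
```

With a pool of independently trained regressors evaluated on a held-out set, binary_pso_select(predictions, targets) would return the subset mask, and the final prediction would be the average of the selected networks' outputs.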
Source: Journal of Zhejiang University: Engineering Science, 2004(12): 1596-1600 (5 pages). Indexed in EI, CAS, CSCD, and the Peking University Core Journals list.
Keywords: neural network; ensemble; particle swarm optimization (PSO); artificial intelligence; computer simulation; integration; learning algorithms; regression analysis

References (11)

1. HANSEN L K, SALAMON P. Neural network ensembles [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1990, 12(10): 993-1001.
2. SCHAPIRE R E. The strength of weak learnability [J]. Machine Learning, 1990, 5(2): 197-227.
3. BREIMAN L. Bagging predictors [J]. Machine Learning, 1996, 24(2): 123-140.
4. ZHOU Z H, WU J X, JIANG Y, et al. Genetic algorithm based selective neural network ensemble [A]. Proceedings of the 17th International Joint Conference on Artificial Intelligence [C]. Seattle, WA: [s.n.], 2001, 2: 797-802.
5. PERRONE M P, COOPER L N. When networks disagree: Ensemble method for neural networks [A]. Artificial Neural Networks: Theory and Applications [C]. San Diego, CA: Academic Press, 1991: 81-96.
6. OPITZ D, SHAVLIK J. Actively searching for an effective neural network ensemble [J]. Connection Science, 1996, 8(3-4): 337-353.
7. SOLLICH P, KROGH A. Learning with ensembles: How over-fitting can be useful [A]. Advances in Neural Information Processing Systems [C]. Cambridge, MA: MIT Press, 1996, 8: 190-196.
8. KROGH A, VEDELSBY J. Neural network ensembles, cross validation and active learning [A]. Advances in Neural Information Processing Systems [C]. Cambridge, MA: MIT Press, 1995, 7: 231-238.
9. KENNEDY J, EBERHART R. Particle swarm optimization [A]. Proceedings of the IEEE International Conference on Neural Networks [C]. Piscataway, NJ: IEEE, 1995: 1942-1948.
10. KENNEDY J, EBERHART R. A discrete binary version of the particle swarm optimization [A]. Proceedings of the IEEE International Conference on Computational Cybernetics and Simulation [C]. Piscataway, NJ: IEEE, 1997: 4104-4108.
