
Ant colony optimization-based approach for selective neural network ensemble

Cited by: 7
Abstract: A new approach is presented to improve the performance of selective neural network ensembles by choosing individuals that are both accurate and diverse from a pool of independently trained candidate neural networks. An ant colony optimization (ACO) algorithm selects a subset of the candidates to form the ensemble; during the optimization, each individual's selection probability is determined by pheromone and heuristic information. The pheromone reflects the accuracy of the current individual, while the heuristic information reflects the diversity among individuals, which effectively improves search efficiency and prediction accuracy. Experiments on typical data sets show that this approach yields ensembles of smaller size while achieving better prediction error than the traditional Bagging and Boosting algorithms.
Source: Journal of Zhejiang University: Engineering Science (浙江大学学报(工学版)), 2009, No. 9: 1568-1573 (6 pages). Indexed in EI, CAS, CSCD, Peking University Core.
Funding: Zhejiang Provincial Natural Science Foundation (Y107435); Hangzhou Municipal University Key Laboratory Science and Technology Innovation Project (20080431T08).
Keywords: ant colony optimization (ACO); neural network; selective ensemble
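The abstract describes a selection probability driven by two factors: pheromone, reflecting each network's accuracy, and a heuristic factor, reflecting diversity between networks. The paper's exact formulas are not reproduced on this page, so the following is only a minimal illustrative sketch in Python. The objective function, the pheromone update rule, and the diversity measure here are simple placeholder assumptions, not the authors' definitions.

```python
import random

def aco_select_ensemble(accuracy, diversity, n_select, n_ants=20, n_iters=50,
                        alpha=1.0, beta=1.0, rho=0.1, seed=0):
    """Sketch of ACO-based selective ensemble construction.

    accuracy[i]      -- validation accuracy of network i (seeds the pheromone)
    diversity[i][j]  -- pairwise disagreement between networks i and j
    Returns the best subset of indices found and its (toy) score.
    """
    rng = random.Random(seed)
    n = len(accuracy)
    tau = list(accuracy)                 # pheromone initialised from accuracy
    best_subset, best_score = None, float("-inf")

    def subset_score(sub):
        # toy objective: mean accuracy plus mean pairwise diversity
        acc = sum(accuracy[i] for i in sub) / len(sub)
        pairs = [(i, j) for i in sub for j in sub if i < j]
        div = sum(diversity[i][j] for i, j in pairs) / max(len(pairs), 1)
        return acc + div

    for _ in range(n_iters):
        for _ant in range(n_ants):
            sub = [rng.randrange(n)]     # each ant starts from a random network
            while len(sub) < n_select:
                cand = [i for i in range(n) if i not in sub]
                # heuristic factor: mean diversity of a candidate vs. chosen set
                def eta(i):
                    return sum(diversity[i][j] for j in sub) / len(sub)
                # roulette-wheel selection weighted by pheromone and heuristic
                weights = [tau[i] ** alpha * (eta(i) + 1e-9) ** beta for i in cand]
                r = rng.random() * sum(weights)
                acc_w = 0.0
                for i, w in zip(cand, weights):
                    acc_w += w
                    if acc_w >= r:
                        sub.append(i)
                        break
            s = subset_score(sub)
            if s > best_score:
                best_subset, best_score = sorted(sub), s
        # evaporate pheromone and reinforce networks in the best subset so far
        tau = [(1 - rho) * t for t in tau]
        for i in best_subset:
            tau[i] += rho * best_score
    return best_subset, best_score
```

A small usage example: with five candidate networks, `aco_select_ensemble(acc, div, n_select=3)` returns three indices balancing high accuracy against mutual diversity. The parameter names (`alpha`, `beta`, `rho`) follow common ACO conventions and are assumptions, not values from the paper.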

References (28 total; first 10 shown)

1. HANSEN L K, SALAMON P. Neural network ensembles [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1990, 12(10): 993-1001.
2. SCHAPIRE R E. The strength of weak learnability [J]. Machine Learning, 1990, 5(2): 197-227.
3. FREUND Y. Boosting a weak learning algorithm by majority [J]. Information and Computation, 1995, 121(2): 256-285.
4. BREIMAN L. Bagging predictors [J]. Machine Learning, 1996, 24(2): 123-140.
5. HO T K. The random subspace method for constructing decision forests [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1998, 20(8): 832-844.
6. WANG Yao-nan, ZHANG Dong-bo, HUANG Hui-xian. Neural network ensemble based on rough sets reduction and selective strategy [C]// Proceedings of the 7th World Congress on Intelligent Control and Automation. Chongqing: IEEE, 2008: 2033-2038.
7. LIN Jian, ZHU Bang-zhu. Neural network ensemble based on feature selection [C]// 2007 IEEE International Conference on Control and Automation. Guangzhou: IEEE, 2007: 1844-1847.
8. ISLAM M M, YAO X, MURASE K. A constructive algorithm for training cooperative neural network ensembles [J]. IEEE Transactions on Neural Networks, 2003, 14(4): 820-834.
9. LIU Yong, YAO Xin. A cooperative ensemble learning system [C]// Proceedings of the 1998 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence). Anchorage, Alaska: IEEE, 1998: 2202-2207.
10. LIU Yong, YAO Xin. Negatively correlated neural networks for classification [C]// Proceedings of the 3rd International Symposium on Artificial Life and Robotics (AROB III-98). Beppu, Japan: [s.n.], 1998: 736-739.


Co-cited literature: 40

Shared citing literature: 131

Citing articles: 7

Secondary citing articles: 155
