
MW-OBS: An Improved Pruning Method for Topology Design of Neural Networks (Cited by: 1)

Abstract: Topology design of artificial neural networks (ANNs) is an important problem for large-scale applications. This paper describes a new, efficient pruning method, the multi-weight optimal brain surgeon (MW-OBS) method, for optimizing neural network topologies. The method was developed by analyzing the advantages and disadvantages of OBS and unit-OBS. For complex problems, optimized topologies are difficult to obtain within a reasonable time. Motivated by the mechanism of natural neurons, the MW-OBS method balances accuracy against time complexity to achieve better neural network performance. The method deletes multiple connections among neurons in each step according to the second derivative of the error function, so the algorithm converges rapidly while the accuracy of the neural network remains high. The stability and generalization ability of the method are illustrated with a Java program. The results show that MW-OBS has the same accuracy as OBS, while its running time is similar to that of unit-OBS. Therefore, the MW-OBS method can be used to efficiently optimize the structures of neural networks for large-scale applications.
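The abstract does not reproduce the pruning formulas, but OBS-style methods typically rank each weight by the saliency L_q = w_q^2 / (2 [H^-1]_qq) computed from the inverse Hessian of the training error, delete the lowest-saliency weights, and adjust the remaining weights by delta_w = -(w_q / [H^-1]_qq) * H^-1 * e_q. Since the paper's experiments are reported as a Java program, the following is a minimal Java sketch of a multi-weight pruning step that removes the k lowest-saliency weights per iteration; the class name MwObsSketch, the hand-written inverse Hessian, and the reuse of H^-1 across the k deletions within one step are assumptions made for illustration, not the authors' exact procedure.

import java.util.Arrays;
import java.util.Comparator;

/**
 * Sketch of an OBS-style multi-weight pruning step, assuming the inverse
 * Hessian of the training error is already available (not the authors'
 * published implementation).
 */
public class MwObsSketch {

    /** Saliency of weight q: L_q = w_q^2 / (2 * [H^-1]_qq). */
    static double saliency(double[] w, double[][] hInv, int q) {
        return (w[q] * w[q]) / (2.0 * hInv[q][q]);
    }

    /**
     * Delete the k weights with smallest saliency and adjust the remaining
     * weights with the standard OBS update
     *   delta_w = -(w_q / [H^-1]_qq) * H^-1 * e_q,
     * applied once per deleted weight. Reusing the same H^-1 for all k
     * deletions is an approximation assumed here for brevity.
     */
    static double[] pruneK(double[] w, double[][] hInv, int k) {
        int n = w.length;
        Integer[] order = new Integer[n];
        for (int i = 0; i < n; i++) order[i] = i;
        // Sort weight indices by increasing saliency.
        Arrays.sort(order, Comparator.comparingDouble(q -> saliency(w, hInv, q)));

        double[] wNew = w.clone();
        for (int j = 0; j < k && j < n; j++) {
            int q = order[j];
            double scale = wNew[q] / hInv[q][q];
            for (int i = 0; i < n; i++) {
                wNew[i] -= scale * hInv[i][q]; // adjust all weights along column q of H^-1
            }
            wNew[q] = 0.0; // force exact removal of the pruned connection
        }
        return wNew;
    }

    public static void main(String[] args) {
        // Toy example: four weights, diagonal inverse Hessian (assumed values).
        double[] w = {0.8, -0.05, 0.3, 0.02};
        double[][] hInv = {
            {0.5, 0.0, 0.0, 0.0},
            {0.0, 0.9, 0.0, 0.0},
            {0.0, 0.0, 0.4, 0.0},
            {0.0, 0.0, 0.0, 1.2}
        };
        System.out.println(Arrays.toString(pruneK(w, hInv, 2)));
    }
}

With a diagonal H^-1 the update only zeroes the pruned weights; with a full inverse Hessian the surviving weights are also shifted to compensate, which is what lets OBS-type pruning keep accuracy high while removing connections.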
Source: Tsinghua Science and Technology (清华大学学报(自然科学版)(英文版)), indexed in SCIE, EI, and CAS, 2006, Issue 3, pp. 307-312 (6 pages)
Funding: Supported by the National Natural Science Foundation of China (Nos. 70101008, 70231010, 70321001, and 70471005)
Keywords: neural networks; topology design of artificial neural network; pruning methods

