
A pruning algorithm based on neural complexity

Cited by: 10
Abstract: To address the problem of neural network structure design, a pruning algorithm based on neural complexity is proposed. During training, the information entropy of the network is computed from the covariance matrix of the network's connection weight matrix, which yields the complexity of the network. On the premise of preserving the network's information-processing capacity, the hidden node with the least influence on the network complexity is deleted. The algorithm does not require training the network to a minimum of the cost function, which makes it suitable for pruning the network structure on-line, and it avoids pre-processing the network weights before the structure is adjusted. Simulation results on typical function-approximation problems show that the algorithm effectively simplifies the network structure while maintaining the approximation accuracy of the network.
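The abstract does not give the exact entropy estimator or deletion rule. As a minimal illustrative sketch (not the authors' implementation), the Python code below assumes a Gaussian (log-determinant) entropy over the rows of the hidden-layer weight matrix and treats the entropy change caused by removing a hidden node as its saliency, pruning the node whose removal changes the complexity measure least. The helper names gaussian_entropy and hidden_node_saliencies are hypothetical.

import numpy as np

def gaussian_entropy(cov):
    """Differential entropy of a multivariate Gaussian with covariance `cov`."""
    n = cov.shape[0]
    # A small ridge keeps the log-determinant finite for near-singular covariances.
    sign, logdet = np.linalg.slogdet(cov + 1e-8 * np.eye(n))
    return 0.5 * (n * np.log(2.0 * np.pi * np.e) + logdet)

def hidden_node_saliencies(W):
    """
    W: hidden-layer weight matrix, one row per hidden node.
    Returns, for each hidden node, the change in the entropy-based complexity
    measure caused by removing that node; the node with the smallest change
    is the pruning candidate.
    """
    h_full = gaussian_entropy(np.cov(W))
    saliencies = []
    for i in range(W.shape[0]):
        W_pruned = np.delete(W, i, axis=0)   # network without hidden node i
        saliencies.append(h_full - gaussian_entropy(np.cov(W_pruned)))
    return np.array(saliencies)

# Usage: with 6 hidden nodes and 4 inputs, prune the node whose removal
# changes the complexity measure the least.
rng = np.random.default_rng(0)
W = rng.normal(size=(6, 4))
prune_index = int(np.argmin(hidden_node_saliencies(W)))

Under this reading, "preserving the information-processing capacity" would correspond to pruning only while the smallest saliency stays below a chosen tolerance; the paper itself may use a different criterion.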
Source: Control and Decision (《控制与决策》; EI, CSCD, Peking University Core Journal), 2010, No. 6, pp. 821-824, 830 (5 pages).
Funding: National Natural Science Foundation of China (60873043); National 863 Program (2007AA04Z160, 2009AA04Z155); Doctoral Fund of the Ministry of Education (200800050004); Beijing Natural Science Foundation (4092010).
Keywords: Pruning algorithm; Neural complexity; Mutual information entropy
Related literature

References (12)

1. Li Ming-ai, Qiao Jun-fei, Ruan Xiao-gang. Moving-domain control method based on recurrent neural networks[J]. Control and Decision, 2006, 21(8): 918-922. (Cited by: 1)
2. LeCun Y, Denker J S, Solla S A. Optimal brain damage[J]. Advances in Neural Information Processing Systems, 1990, (2): 598-605.
3. Hassibi B, Stork D G. Second order derivatives for network pruning: Optimal brain surgeon[J]. Advances in Neural Information Processing Systems, 1993, (5): 164-171.
4. Qiao Jun-fei, Zhang Ying, Han Hong-gui. Fast unit pruning algorithm for feed-forward neural network design[J]. Applied Mathematics and Computation, 2008, 205(2): 662-667.
5. Xu Jinhua, Ho Daniel W C. A new training and pruning algorithm based on node dependence and Jacobian rank deficiency[J]. Neurocomputing, 2006, 70(1-3): 544-558.
6. Lauret P, Fock E, Mara T A. A node pruning algorithm based on a Fourier amplitude sensitivity test method[J]. IEEE Trans on Neural Networks, 2006, 17(2): 273-293.
7. Erdogmus D, Principe J C. Generalized information potential criterion for adaptive system training[J]. IEEE Trans on Neural Networks, 2002, 13(5): 1035-1044.
8. Van Hulle M M. Joint entropy maximization in kernel-based topographic maps[J]. Neural Computation, 2002, 14(8): 1887-1906.
9. Guo Wei, Zhang Zhao-zhao. Application of entropy in the pruning algorithm of BP neural networks[J]. Information and Control, 2009, 38(5): 633-636. (Cited by: 2)
10. Sporns O, Tononi G, Edelman G M. Connectivity and complexity: The relationship between neuroanatomy and brain dynamics[J]. Neural Networks, 2000, 13(9): 909-922.

Secondary references (20)

1. Zhao Qiang-fu. Problems in applying neural networks to quadratic optimization and their solutions[J]. Journal of Beijing Institute of Technology, 1994, 14(1): 1-5. (Cited by: 4)
2. Liu Chao-bin, Qiao Jun-fei. Fuzzy neural network control of sludge age in the wastewater treatment process[J]. Information and Control, 2006, 35(1): 16-20. (Cited by: 6)
3. Yang Hui-zhong, Wang Wei-na, Ding Feng. Research on two structure optimization algorithms for neural networks[J]. Information and Control, 2006, 35(6): 700-704. (Cited by: 11)
4. Qiao J F, Zhang Y, Han H G. Fast unit pruning algorithm for feedforward neural network design[J]. Applied Mathematics and Computation, 2008, 205(2): 622-627.
5. Erdogmus D, Principe J C. An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems[J]. IEEE Transactions on Signal Processing, 2002, 50(7): 1780-1786.
6. Schraudolph N N. Gradient-based manipulation of nonparametric entropy estimates[J]. IEEE Transactions on Neural Networks, 2004, 15(4): 828-837.
7. Erdogmus D, Principe J C. Generalized information potential criterion for adaptive system training[J]. IEEE Transactions on Neural Networks, 2002, 13(5): 1035-1044.
8. Ozertem U, Uysal I, Erdogmus D. Continuously differentiable sample-spacing entropy estimation[J]. IEEE Transactions on Neural Networks, 2008, 19(11): 1978-1984.
9. Battiti R. Using mutual information for selecting features in supervised neural net learning[J]. IEEE Transactions on Neural Networks, 1994, 5(4): 537-550.
10. Deco G, Finnoff W, Zimmermann H G. Unsupervised mutual information criterion for elimination of overtraining in supervised multilayer networks[J]. Neural Computation, 1995, 7(1): 86-107.

Co-citing literature (1)

Co-cited literature (105)

1. Li Qian, Wang Yong-xian, Zhu You-qin. A hybrid pruning algorithm for artificial neural networks[J]. Journal of Tsinghua University (Science and Technology), 2005, 45(6): 831-834. (Cited by: 7)
2. Hu Bao-gang, Wang Yong, Yang Shuang-hong, Qu Han-bing. How to increase the transparency of artificial neural networks?[J]. Pattern Recognition and Artificial Intelligence, 2007, 20(1): 72-84. (Cited by: 11)
3. Ren Xiao-kang, Wu Shang-zhi, Ma Ru-yun. An attribute-frequency reduction algorithm based on the discernibility matrix[J]. Journal of Lanzhou University (Natural Sciences), 2007, 43(1): 138-140. (Cited by: 26)
4. Zhang Jian-ying, Cheng Jian, Hou Yu-hua, Bai Jing-yi, Pei Xiao-fei. An ANFIS method for predicting gas concentration in coal mines[J]. Journal of China University of Mining & Technology, 2007, 36(4): 494-498. (Cited by: 34)
5. Ma L, Khorasani K. Constructive feedforward neural networks using Hermite polynomial activation functions[J]. IEEE Transactions on Neural Networks, 2005, 16(4): 821-833.
6. Islam Monirul, Sattar A, Amin F, Yao Xin, Murase K. A new adaptive merging and growing algorithm for designing artificial neural networks[J]. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 2009, 39(3): 705-722.
7. Kraskov A, Stögbauer H, Grassberger P. Estimating mutual information[J]. Physical Review E, 2004, 69(6): 066138.
8. Hong Jie, Hu Bao-gang. Two-phase construction of multilayer perceptrons using information theory[J]. IEEE Transactions on Neural Networks, 2009, 20(4): 542-550.
9. Liu Yinyin, Starzyk J A, Zhu Zhen. Optimized approximation algorithm in neural networks without overfitting[J]. IEEE Transactions on Neural Networks, 2008, 19(6): 983-995.
10. Hassibi B, Stork D G, Wolff G, Watanabe T. Optimal brain surgeon: Extensions and performance comparisons[C]//Advances in Neural Information Processing Systems 6. San Mateo, USA: Morgan Kaufmann, 1994: 263-270.

Citing literature (10)

Secondary citing literature (73)
