
A Neural Network with Learnable Activation Functions (cited by: 1)

Active Functions Learning Neural Network
Abstract: A neural network with learnable activation functions is proposed. Its neuron functions are not fixed; each is a linear combination of arbitrary linearly independent basis functions, so the network learns by adjusting the basis-function coefficients inside the neurons. For convenience of structure optimization, the multi-dimensional input of each neuron is mapped to a one-dimensional space before being passed to the next layer. Exploiting these characteristics, two fast parameter-learning algorithms that require no iteration are proposed for training the network. Simulation experiments on three examples show that the designed network has strong approximation capability and extremely fast parameter learning. (Original English abstract, corrected:) In this paper, a novel neural network is proposed whose activation functions can be learned. The activation functions are not fixed in advance but are learned from the problem; each can be a linear combination of any linearly independent basis functions, and the network is trained by tuning the coefficients of these basis functions. For the convenience of structure optimization, the input vectors of neurons are mapped to scalar variables. Two quick learning algorithms that need no iterative procedure are proposed for this network. The results show that the network has good function-approximation capacity and fast learning speed.
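The coefficient-tuning scheme described in the abstract can be sketched for a single neuron. This is a minimal illustration, not the paper's actual algorithm: it assumes a polynomial basis and a fixed input-weight vector (both assumptions for the example), maps the multi-dimensional input to a scalar as the abstract describes, and fits the basis coefficients in one least-squares solve, i.e., learning without iteration.

```python
import numpy as np

def basis(z, degree=3):
    # Polynomial basis [1, z, z^2, z^3] for each scalar pre-activation z
    # (the paper allows any linearly independent basis; polynomials are
    # an assumption made here for illustration).
    return np.vstack([z**k for k in range(degree + 1)]).T

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))   # training inputs
w = np.array([1.0, 0.5])                    # fixed input weights (assumed)
z = X @ w                                   # multi-dim input mapped to a scalar
y = np.sin(z)                               # target: a smooth function of z

# Iteration-free learning: the neuron's output Phi(z) @ coef is linear in
# the basis coefficients, so they come from one least-squares solve.
Phi = basis(z)                              # shape (200, 4)
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)

mse = float(np.mean((Phi @ coef - y) ** 2))
```

Because the output is linear in the coefficients, "training" reduces to solving a linear system, which is what makes the learning speed here so fast compared with iterative gradient descent.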
Source: Journal of Jiangnan University (Natural Science Edition), CAS, 2015, No. 6, pp. 689-694 (6 pages)
Funding: National Natural Science Foundation of China (61273187); NSFC Science Fund for Creative Research Groups (61321003)
Keywords: neural network; activation function; quick learning algorithms

References (18; first 10 shown)

  • 1 Sin-Chun Ng, Chi-Chung Cheung, Shu-Hung Leung. Magnified gradient function with deterministic weight modification in adaptive learning[J]. IEEE Transactions on Neural Networks, 2004, 15(6): 1411-1423.
  • 2 Zweiri Y H, Whidborne J F, Althoefer K, et al. A new three-term backpropagation algorithm with convergence analysis[C]// Proceedings of ICRA'02, IEEE International Conference on Robotics and Automation. Washington DC: IEEE, 2002, 4: 3882-3887.
  • 3 Zweiri Y H, Seneviratne L D, Althoefer K. Stability analysis of a three-term backpropagation algorithm[J]. Neural Networks, 2005, 18(10): 1341-1347.
  • 4 Jim Y F Yam, Tommy W S Chow. A weight initialization method for improving training speed in feedforward neural network[J]. Neurocomputing, 2000, 30(1): 219-232.
  • 5 Yat-Fung Yam, Chi-Tat Leung, Peter K S Tam, et al. An independent component analysis based weight initialization method for multilayer perceptrons[J]. Neurocomputing, 2002, 48(1): 807-818.
  • 6 Castellano G, Fanelli A M, Pelillo M. An iterative pruning algorithm for feedforward neural networks[J]. IEEE Transactions on Neural Networks, 1997, 8(3): 519-531.
  • 7 Scott Fahlman, Christian Lebiere. The cascade-correlation learning architecture[J]. Advances in Neural Information Processing Systems, 1991, 2: 524-532.
  • 8 Lauret P, Fock E, Mara T A. A node pruning algorithm based on a Fourier amplitude sensitivity test method[J]. IEEE Transactions on Neural Networks, 2006, 17(2): 273-293.
  • 9 Zhihong Yao, Minrui Fei, Kang Li, et al. Recognition of blue-green algae in lakes using distributive genetic algorithm-based neural networks[J]. Neurocomputing, 2007, 70(4): 641-647.
  • 10 Leung F H F, Lam H K, Ling S H, et al. Tuning of the structure and parameters of a neural network using an improved genetic algorithm[J]. IEEE Transactions on Neural Networks, 2003, 14(1): 79-88.

Co-cited references: 10

Citing articles: 1

Second-level citing articles: 2
