
A Novel Method for Choosing Initial Weights of Complex-Valued Neural Networks (cited by 1)

Algorithm of choosing initialized weights for training complex-valued neural networks
Abstract: To improve learning speed, a novel method for initializing the weights of complex-valued neural networks is proposed. The initial weights are not assigned at random but computed: by choosing a class of activation functions (support-set-like functions) for the hidden neurons, the complex-valued weights between the input and hidden layers are calculated so that the output matrix of the hidden layer has full rank, and the existence of such a full-rank matrix is proved theoretically. Using this full-rank matrix, the complex-valued weights between the hidden and output layers are then obtained by the least-squares algorithm. With these weights as the initial weights, the network is trained by the steepest-descent algorithm. Because the initial weights are optimized, the method effectively improves both the training speed and the computational accuracy of complex-valued neural networks. In the special case where the number of hidden neurons equals the number of training patterns, a global minimum with zero cost is obtained. Computer simulations confirm the effectiveness of the algorithm.
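The initialization pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's construction: the sizes, the random data, and the split-type tanh activation are assumptions, and the input-to-hidden weights are drawn at random (which makes the hidden output matrix full rank almost surely) rather than computed from the paper's special class of activation functions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: N training patterns, n inputs, m hidden neurons, k outputs.
# m == N is the special case noted in the abstract (zero-cost global minimum).
N, n, m, k = 8, 3, 8, 2

# Illustrative random complex-valued training data.
X = rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))
T = rng.standard_normal((N, k)) + 1j * rng.standard_normal((N, k))

# Input-to-hidden weights. The paper computes these from a class of activation
# functions that guarantees a full-rank hidden output matrix H; here we simply
# draw them at random, which yields a full-rank H almost surely.
W1 = rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))
Z = X @ W1
H = np.tanh(Z.real) + 1j * np.tanh(Z.imag)  # split-type activation (assumption)

# Hidden-to-output weights by least squares: minimizes ||H @ W2 - T||^2.
W2, *_ = np.linalg.lstsq(H, T, rcond=None)

# With m == N and H nonsingular, the residual cost is numerically zero, so
# steepest-descent training would start from a global minimum of the cost.
cost = np.linalg.norm(H @ W2 - T) ** 2
```

In the general case m < N, the least-squares fit yields a small but nonzero starting cost; the pair (W1, W2) then replaces random initialization before steepest-descent training.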
Author: Zhang Daiyuan (张代远)
Source: Systems Engineering and Electronics (《系统工程与电子技术》; EI, CSCD, Peking University Core), 2006, No. 6, pp. 929-932 (4 pages)
Keywords: artificial intelligence; complex-valued neural networks; high training accuracy; fast learning
Related Literature

References (5)

  • 1. Hagan M T, Demuth H B, Beale M. Neural Network Design [M]. Beijing: China Machine Press (English edition), 2002: 9-1 to 9-37.
  • 2. Leung H, Haykin S. The complex backpropagation algorithm [J]. IEEE Transactions on Signal Processing, 1991, 39(9): 2101-2104.
  • 3. Benvenuto N, Piazza F. On the complex backpropagation algorithm [J]. IEEE Transactions on Signal Processing, 1992, 40(4): 967-969.
  • 4. Zhang Daiyuan, Wang Shaodi. A new algorithm for complex feedforward neural networks [J]. Systems Engineering and Electronics, 2000, 22(4): 36-38. (Cited by 3)
  • 5. Ergezinger S, Thomsen E. An accelerated learning algorithm for multilayer perceptrons: optimization layer by layer [J]. IEEE Transactions on Neural Networks, 1995, 6(1): 31-42.


Co-cited References (9)

  • 1. Georgiou G M, Koutsougeras C. Complex domain backpropagation [J]. IEEE Transactions on Circuits and Systems II, 1992, 39(5): 330-334.
  • 2. Widrow B, McCool J, Ball M. The complex LMS algorithm [J]. Proceedings of the IEEE, 1975, 63: 719-720.
  • 3. Leung H, Haykin S. The complex backpropagation algorithm [J]. IEEE Transactions on Signal Processing, 1991, 39(9): 2101-2104.
  • 4. Benvenuto N, Piazza F. On the complex backpropagation algorithm [J]. IEEE Transactions on Signal Processing, 1992, 40(4): 967-969.
  • 5. Nitta T. An extension of the back-propagation algorithm to complex numbers [J]. Neural Networks, 1997, 10: 1391-1415.
  • 6. Rumelhart D E, Hinton G E, Williams R J. Learning internal representations by error propagation [M]// Parallel Distributed Processing. 1986, (1): 318-362.
  • 7. Zhang Daiyuan, Wang Shaodi. A new algorithm for complex feedforward neural networks [J]. Systems Engineering and Electronics, 2000, 22(4): 36-38. (Cited by 3)
  • 8. Tang Puying, Li Shaorong, Huang Shunji. A new training method for complex-valued recurrent neural networks and its application [J]. Signal Processing, 2001, 17(6): 515-520. (Cited by 1)
  • 9. Wang Junfeng, Zhang Bin, Song Guoxiang. Adaptive equalization algorithms based on complex RBF neural networks [J]. Systems Engineering and Electronics, 2003, 25(7): 848-850. (Cited by 7)

