
Sample Learning Based on λ-increasing Function
Abstract: Determining the structure of a neural network has long been a difficult problem in both theoretical research and practical applications. Drawing on recent results of Vugar E. Ismailov, this paper studies how a neural network learns a given set of sample points. The results show that, with a λ-strictly increasing activation function, any given sample set can be learned using only two hidden-layer neurons. The paper also discusses the differences between using the usual Sigmoid function and a λ-strictly increasing function as the activation function in the hidden layer.
Source: Mathematical Theory and Applications (《数学理论与应用》), 2016, No. 4, pp. 92-105 (14 pages).
Keywords: neural network; neural network structure; λ-increasing function; Sigmoid function
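The abstract refers to a single-hidden-layer network with two hidden neurons. A minimal sketch of that architecture's forward pass is shown below, using the ordinary logistic Sigmoid as a stand-in activation; the paper's λ-strictly increasing function and Ismailov's interpolation construction are not reproduced here, and all parameter values are hypothetical.

```python
import math

def sigmoid(t: float) -> float:
    """The usual logistic Sigmoid activation, 1 / (1 + e^(-t))."""
    return 1.0 / (1.0 + math.exp(-t))

def two_neuron_net(x: float,
                   w1: float, b1: float, c1: float,
                   w2: float, b2: float, c2: float) -> float:
    """Forward pass of a network with exactly two hidden neurons:
       f(x) = c1 * sigma(w1*x + b1) + c2 * sigma(w2*x + b2).
    The paper's result replaces sigma with a lambda-strictly
    increasing function so that any finite sample set can be
    interpolated by a network of this fixed size."""
    return c1 * sigmoid(w1 * x + b1) + c2 * sigmoid(w2 * x + b2)

# Evaluate the network on a few inputs with hypothetical weights.
for x in (-1.0, 0.0, 1.0):
    print(x, two_neuron_net(x, 1.0, 0.0, 1.0, -1.0, 0.0, 1.0))
```

With a fixed Sigmoid, two hidden neurons generally cannot interpolate an arbitrary sample set; the contrast the paper draws is that a λ-strictly increasing activation removes this limitation while keeping the network this small.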
