Abstract
This paper gives, for the first time, a theoretical formula for the global minimum of the cost function (GMCF) of three-layer feedforward neural networks. The formula determines the GMCF before the network is trained, using only the preassigned target patterns and the number of hidden neurons. It is further shown that the GMCF decreases monotonically as the number of hidden neurons increases, and that it equals zero whenever the number of hidden neurons is not less than the number of target patterns.
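The last claim can be illustrated numerically. The following is a minimal sketch, not the paper's formula: it assumes a sum-of-squared-error cost, sigmoid hidden units, and a linear output layer, fixes the hidden-layer weights at random, and solves for the output weights by least squares. The attained cost shrinks as the number of hidden neurons H grows and becomes numerically zero once H reaches the number of target patterns N.

# Minimal sketch (assumed setup: squared-error cost, sigmoid hidden layer, linear output layer).
# Illustrates the claim that the attainable minimum cost decreases with the number of
# hidden neurons and reaches zero when that number is >= the number of target patterns.
import numpy as np

rng = np.random.default_rng(0)
N, d_in, d_out = 8, 3, 2                 # number of target patterns, input dim, output dim
X = rng.standard_normal((N, d_in))       # input patterns
T = rng.standard_normal((N, d_out))      # target patterns

def attained_min_cost(H):
    """Cost remaining after optimally fitting the output weights for H hidden neurons."""
    W1 = rng.standard_normal((d_in, H))              # random (fixed) hidden-layer weights
    b1 = rng.standard_normal(H)
    Phi = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))       # N x H matrix of hidden activations
    W2, *_ = np.linalg.lstsq(Phi, T, rcond=None)     # least-squares optimal output weights
    return 0.5 * np.sum((Phi @ W2 - T) ** 2)

for H in (2, 4, 8, 12):
    print(f"H = {H:2d}  attained cost = {attained_min_cost(H):.3e}")
# The printed cost drops as H grows and is numerically zero once H >= N.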
Source
《信号处理》 (Journal of Signal Processing), CSCD
2001, No. 2, pp. 156-161 (6 pages)
Keywords
Minimum of the cost function
Global minimum
Feedforward neural networks