Abstract
This paper describes the performance of a feedforward multi-layer neural network whose hidden units use non-monotonic activation functions. Two long-standing difficulties of such networks, convergence to local minima and slow learning speed, are analyzed, and it is shown that networks with non-monotonic activation functions can largely avoid the local-minimum problem. Experimental results show that these networks are effective for the tasks, outperform networks with sigmoidal activation functions, and achieve comparable generalization performance.
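The abstract contrasts monotonic sigmoidal units with non-monotonic ones but does not reproduce the specific activation function used. As a minimal sketch, the snippet below compares the standard sigmoid with one common example of a non-monotonic activation, the Gaussian-modulated ramp x·e^(−x²/2); this particular function is an illustrative assumption, not necessarily the one the authors employed.

```python
import math

def sigmoid(x):
    """Standard monotonic S-shaped activation: 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def non_monotonic(x):
    """Illustrative non-monotonic activation (an assumption, not the
    paper's exact function): x * exp(-x^2 / 2) rises to a peak at
    |x| = 1 and then decays back toward zero."""
    return x * math.exp(-x * x / 2.0)

# The sigmoid keeps increasing with its input...
assert sigmoid(2.0) > sigmoid(1.0)
# ...while the non-monotonic unit peaks at x = 1 and then falls off,
# which is the property the paper links to escaping local minima.
assert non_monotonic(2.0) < non_monotonic(1.0)
```

Because its derivative changes sign, a unit like this can reduce its output even as its net input grows, which changes the shape of the error surface relative to a purely monotonic sigmoid.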
Source
《计算机工程》
CAS
CSCD
Peking University Core Journals (北大核心)
2000, No. 4, pp. 4-5, 8 (3 pages in total)
Computer Engineering
Funding
National Natural Science Foundation of China
Keywords
Artificial neural network
BP algorithm
Activation function
Non-monotonic function