Abstract
The structure of a neural network has long been a difficult problem in both theoretical research and practical application. Drawing on recent results of Vugar E. Ismailov, this paper discusses how a neural network can learn a given set of sample points. The results show that, using a λ-strictly increasing activation function, only two hidden-layer nodes are needed to learn any given sample set. The paper also discusses the difference between using the usual sigmoid function and a λ-strictly increasing function as the activation function in the hidden layer.
How to determine the structure of a neural network is often a difficult problem in theoretical research and practical application. Based on the recent research of Vugar E. Ismailov, this paper studies how a neural network learns a set of sample points. The results show that, with a λ-strictly increasing function, any given sample set can be learnt using only two neurons in the hidden layer. The differences between using the usual sigmoid function and a λ-strictly increasing function as the activation function in the hidden layer are presented as well.
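For context, a single-hidden-layer network with two hidden nodes, as discussed above, computes a function of the following standard form (the notation here is a conventional sketch and is not taken from the abstract itself):

```latex
% A feedforward network with one hidden layer of two nodes,
% activation function \sigma, input x \in \mathbb{R}^d,
% weight vectors w_1, w_2, thresholds \theta_1, \theta_2,
% and output coefficients c_1, c_2:
f(x) = c_1\,\sigma(w_1 \cdot x - \theta_1) + c_2\,\sigma(w_2 \cdot x - \theta_2)
```

The claim summarized in the abstract is that, for a suitably constructed λ-strictly increasing σ, the parameters can be chosen so that f interpolates any given finite sample set, whereas the ordinary sigmoid does not have this property with only two hidden nodes.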
Source
《数学理论与应用》 (Mathematical Theory and Applications)
2016, No. 4, pp. 92-105 (14 pages)