Abstract
A novel class of layered feedforward neural network models for function approximation is proposed, based on the principle of the multi-dimensional discrete Fourier transform. A learning algorithm is given in which most of the network's connection weights are fixed; only those between the output layer and the last hidden layer need to be adjusted. Compared with other neural networks, the learning algorithm is much simpler and training is faster. Provided the number of hidden nodes is sufficiently large, the network can approximate any continuous function to any degree of accuracy. Computer simulations show its advantages of fast convergence and high approximation accuracy over the back-propagation (BP) network and fuzzy neural networks.
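The abstract's core idea — a hidden layer of fixed Fourier basis nodes with only the output-layer weights trained — can be illustrated with a minimal sketch. The function names (`fourier_features`, `fit_output_weights`), the frequency cutoff `max_freq`, and the use of linear least squares for the trainable weights are assumptions for illustration, not the paper's actual algorithm:

```python
import numpy as np

def fourier_features(x, max_freq):
    """Fixed hidden layer: cosine/sine nodes at integer frequencies.

    x is a 1-D array of inputs in [0, 1]; these weights are never trained,
    mirroring the paper's claim that most connection weights are fixed.
    """
    ks = np.arange(max_freq + 1)
    cos = np.cos(2 * np.pi * np.outer(x, ks))       # k = 0..max_freq
    sin = np.sin(2 * np.pi * np.outer(x, ks[1:]))   # k = 1..max_freq
    return np.hstack([cos, sin])

def fit_output_weights(x, y, max_freq=8):
    """Train only the output-layer weights via linear least squares."""
    H = fourier_features(x, max_freq)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return w

def predict(x, w, max_freq=8):
    """Network output: fixed Fourier hidden layer times learned weights."""
    return fourier_features(x, max_freq) @ w
```

Because the only free parameters enter linearly, "training" reduces to one least-squares solve rather than iterative backpropagation, which is consistent with the abstract's claim of much faster learning than a BP network; for a target in the span of the basis the fit is exact up to numerical precision.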
Source
《上海交通大学学报》
EI
CAS
CSCD
PKU Core (Peking University Core Journals)
2000, No. 7, pp. 956-959 (4 pages)
Journal of Shanghai Jiaotong University