MISO Multivariate Generalized Polynomials Neural Network and its Weights-Direct-Determination

Cited by: 7
Abstract: A new type of MISO (Multiple-Input, Single-Output) multivariate generalized polynomials neural network is constructed based on multivariate function approximation theory. According to the least-squares principle, a pseudoinverse-based formula is derived that computes the optimal weights in a single step, termed the weights-direct-determination method. On this basis, a hidden-layer evolution algorithm is proposed that adaptively adds and deletes hidden neurons using an exponential-growth and binary-delete-search strategy. The constructed network has a simple structure, and because the weights-direct-determination method and the hidden-neuron evolution algorithm obtain the optimal weights directly without lengthy iterative BP training, they avoid local minima and the difficulty of choosing a learning rate, while also resolving the long-standing problem of determining the number of hidden neurons in conventional BP neural networks. Computer simulations substantiate these advantages, showing fast training, high approximation precision, and good denoising behavior.
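The two core ideas of the abstract can be illustrated with a minimal NumPy sketch. It assumes a simple monomial basis as a stand-in for the paper's multivariate generalized polynomial activations (whose exact construction is not reproduced here), and the function names, the mean-squared-error criterion, and the tolerance are illustrative choices, not the paper's:

```python
import numpy as np

def polynomial_basis(X, n_powers):
    """Design matrix of monomial activations: a constant column plus
    x_j**k for each input j and each power k = 1..n_powers.
    (A simplified stand-in for the paper's generalized polynomial basis.)"""
    cols = [np.ones(X.shape[0])]
    for k in range(1, n_powers + 1):
        for j in range(X.shape[1]):
            cols.append(X[:, j] ** k)
    return np.column_stack(cols)

def direct_weights(Phi, y):
    """Weights-direct-determination: the one-step least-squares optimum
    via the pseudoinverse, w = pinv(Phi) @ y, with no iterative training."""
    return np.linalg.pinv(Phi) @ y

def mse(Phi, w, y):
    return float(np.mean((Phi @ w - y) ** 2))

def grow_and_prune(X, y, tol=1e-10, max_doublings=8):
    """Adaptive search for the basis size (a stand-in for the hidden-neuron
    count): exponential growth doubles the size until the training error
    meets tol, then a binary (halving-deletion) search finds the smallest
    size that still meets it."""
    n = 1
    for _ in range(max_doublings):
        Phi = polynomial_basis(X, n)
        if mse(Phi, direct_weights(Phi, y), y) <= tol:
            break
        n *= 2
    lo, hi = max(1, n // 2), n
    while lo < hi:
        mid = (lo + hi) // 2
        Phi = polynomial_basis(X, mid)
        if mse(Phi, direct_weights(Phi, y), y) <= tol:
            hi = mid
        else:
            lo = mid + 1
    return hi

# Example: y = 2 + 3*x + x**2 is recovered exactly with basis size 2.
X = np.linspace(-1.0, 1.0, 50).reshape(-1, 1)
y = 2.0 + 3.0 * X[:, 0] + X[:, 0] ** 2
n = grow_and_prune(X, y)
w = direct_weights(polynomial_basis(X, n), y)
```

Because the least-squares optimum has a closed form for a linear-in-the-weights basis, each candidate network size is evaluated exactly in one step, which is what makes the grow-then-prune search cheap compared with retraining a BP network at every size.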
Source: Acta Scientiarum Naturalium Universitatis Sunyatseni (《中山大学学报(自然科学版)》), indexed in CAS, CSCD, and the Peking University Core list, 2009, No. 4, pp. 42-46 and 56 (6 pages).
Funding: National Natural Science Foundation of China (60643004, 60775050); Sun Yat-sen University scientific research start-up fund and reserved key project fund.
Keywords: multivariate generalized polynomials; weights-direct-determination; structure-adaptive-determination; exponential growth; binary search
