Low-Complexity Filter Based on the Back-Propagation Through Time Algorithm (Cited by: 1)
Abstract: To further reduce the computational complexity of the pipelined bilinear recurrent neural network (P-BLRNN), the back-propagation through time (BPTT) algorithm is applied to the P-BLRNN. BPTT uses ordered derivatives: the derivative of the error function with respect to the weights is computed backward through time. This backward ordering lowers the initialization requirements, weakens the network's sensitivity to initial conditions, and reduces the computational complexity. The network adopts a modular design in which the modules execute their tasks in parallel, improving computational efficiency. Based on the structure of the P-BLRNN and the mathematical model of its neurons, a concrete implementation of the BPTT algorithm is presented. Simulations are carried out to evaluate the filter's performance in nonlinear system identification. The experimental results show that the P-BLRNN trained with the BPTT algorithm provides very good performance.
Authors: 周莲英, 汤彧
Source: Wireless Communication Technology (《无线通信技术》), 2013, No. 4, pp. 1-6 (6 pages)
基金 "基于商务智能和数据集中管控的集团信息化服务平台" 2011年江苏省科技支撑项目(BE2011156)
Keywords: neural network filter; back-propagation through time (BPTT) algorithm; recurrent neural network; nonlinear system identification
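
The record itself contains no code or equations. The following Python sketch is added only to illustrate the general idea the abstract describes: computing the derivative of the error with respect to the weights backward through time (ordered derivatives) for a bilinear recurrent unit. The scalar neuron model, the function names bilinear_forward and bptt_gradients, the weights a, b, c, d, and the toy identification task are all assumptions made for illustration; they are not the paper's P-BLRNN, which cascades several such modules in a pipelined, parallel structure.

```python
# A minimal, illustrative BPTT sketch for a single bilinear recurrent neuron.
# It is NOT the paper's P-BLRNN: the neuron model, the function names
# (bilinear_forward, bptt_gradients), the weights a, b, c, d and the toy task
# are all assumptions made only to show the reverse-time gradient ordering.
import numpy as np

def bilinear_forward(x, w):
    """Run the neuron over the input sequence x with weights w = (a, b, c, d).
    Assumed model: s[t] = a*x[t] + b*y[t-1] + c*x[t]*y[t-1] + d, y[t] = tanh(s[t])."""
    a, b, c, d = w
    y = np.zeros(len(x))
    y_prev = 0.0                                  # zero initial condition
    for t in range(len(x)):
        s = a * x[t] + b * y_prev + c * x[t] * y_prev + d
        y[t] = np.tanh(s)
        y_prev = y[t]
    return y

def bptt_gradients(x, target, w):
    """Accumulate dL/dw backward in time for L = 0.5 * sum_t (y[t] - target[t])**2.
    'carry' holds the gradient flowing into y[t] from step t+1, so no per-weight
    forward sensitivities (and no special initialization of them) are needed."""
    _, b, c, _ = w                                # only b and c re-enter the recursion
    y = bilinear_forward(x, w)
    grads = np.zeros(4)
    carry = 0.0                                   # dL/dy[t] contribution from the future
    for t in reversed(range(len(x))):
        dL_dy = (y[t] - target[t]) + carry
        dL_ds = dL_dy * (1.0 - y[t] ** 2)         # tanh'(s) = 1 - y**2
        y_prev = y[t - 1] if t > 0 else 0.0
        grads += dL_ds * np.array([x[t], y_prev, x[t] * y_prev, 1.0])
        # y[t-1] enters s[t] linearly (via b) and through the bilinear term (via c*x[t])
        carry = dL_ds * (b + c * x[t])
    return grads / len(x)

# Toy usage: fit the neuron to an arbitrary nonlinear reference by gradient descent.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200)
target = np.tanh(0.5 * x + 0.3 * np.roll(x, 1) * x)
w = np.zeros(4)
for _ in range(2000):
    w = w - 0.2 * bptt_gradients(x, target, w)
print("learned weights (a, b, c, d):", w)
```

The backward sweep reuses only the stored forward outputs y[t]. This reverse ordering is the property the abstract attributes to BPTT: it avoids the forward sensitivity recursions, and their initialization, that a forward-mode recurrent training rule would require, which is also where the reduced sensitivity to initial conditions and the complexity saving come from. In the actual P-BLRNN, several such bilinear modules are cascaded and can carry out their local computations in parallel.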