Abstract
The stochastic dual-gradient algorithm is an important learning algorithm in independent component analysis (ICA), but it converges slowly and has a large steady-state error, which hinders accurate and timely signal processing. An improved stochastic dual-gradient algorithm is proposed in which negentropy, rather than kurtosis, is used to measure the non-Gaussianity of the random variables, thereby overcoming the non-robustness of kurtosis. Simulation results show that the improved algorithm achieves better separation performance and higher stability.
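The abstract's central point is that kurtosis, a fourth-moment statistic, is easily dominated by outliers, while a negentropy-based contrast is not. The paper's own algorithm is not reproduced here; the following is a minimal sketch of the two non-Gaussianity measures, using Hyvärinen's standard log-cosh approximation of negentropy (an assumption, not necessarily the contrast used in the paper):

```python
import numpy as np

def excess_kurtosis(y):
    """Excess kurtosis E{y^4} - 3 of a standardized signal.
    A fourth-moment statistic, so a single large outlier can dominate it."""
    y = (y - y.mean()) / y.std()
    return np.mean(y ** 4) - 3.0

def negentropy_approx(y, n_ref=100_000, seed=0):
    """One-unit negentropy approximation J(y) ~ (E{G(y)} - E{G(v)})^2,
    with G(u) = log cosh(u) and v ~ N(0, 1).  Because G grows only
    linearly in |u|, outliers have far less influence than in kurtosis."""
    y = (y - y.mean()) / y.std()
    v = np.random.default_rng(seed).standard_normal(n_ref)
    G = lambda u: np.log(np.cosh(u))
    return (G(y).mean() - G(v).mean()) ** 2
```

Contaminating a Gaussian sample with one large outlier sends the excess kurtosis into the hundreds, while the log-cosh negentropy estimate barely moves, which is the robustness argument the abstract makes for replacing kurtosis with negentropy.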
Source
Journal of Lanzhou University of Technology (《兰州理工大学学报》), indexed in CAS and the Peking University Core Journals list; 2014, No. 2, pp. 110-113 (4 pages).
Funding
Youth Fund of the Sichuan Provincial Education Department (10ZC102)
Keywords
stochastic dual-gradient algorithm; independent component analysis; negentropy; kurtosis