Abstract
The true weight decay recursive least squares (TWDRLS) algorithm achieves good performance at the expense of high per-iteration computational complexity and storage requirements. By combining the local linearized least squares (LLLS) algorithm with a regularization factor, an LLLS algorithm with regularization is presented for training multilayer feedforward neural networks, greatly reducing the per-iteration computational complexity and storage of TWDRLS. Simulations show that the modified algorithm improves the robustness and generalization ability of the original LLLS algorithm, and its performance is close to that of TWDRLS.
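To make the per-iteration cost concrete, the sketch below shows a generic recursive least squares update for a linear model, with the inverse correlation matrix initialized by a regularization factor (a ridge-style weight decay at start-up). This is an illustrative sketch only, not the paper's TWDRLS or LLLS method; the function name `rls_train` and the parameter `reg` are my own. Maintaining and updating the n x n matrix P is what gives RLS-type algorithms their O(n^2) time and storage cost per iteration, the burden the abstract refers to.

```python
import numpy as np

def rls_train(X, d, reg=0.1):
    """Recursive least squares for a linear model y = w @ x.

    reg: regularization factor; P is initialized to (1/reg) * I,
    which corresponds to ridge-style regularization of the initial
    weights. Each update costs O(n^2) time and storage for the
    n x n inverse correlation matrix P.
    """
    n = X.shape[1]
    w = np.zeros(n)
    P = np.eye(n) / reg          # regularized inverse correlation matrix
    for x, target in zip(X, d):
        Px = P @ x
        k = Px / (1.0 + x @ Px)  # gain vector
        e = target - w @ x       # a priori prediction error
        w = w + k * e            # weight update
        P = P - np.outer(k, Px)  # rank-one update of P
    return w
```

On noiseless data this recovers the ridge solution (X'X + reg*I)^{-1} X'd, so for small `reg` the learned weights approach the true ones.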
Source
Systems Engineering and Electronics (《系统工程与电子技术》), 2004, No. 9, pp. 1312-1314 (3 pages). Indexed in EI, CSCD, and the Peking University Core Journal list (北大核心).
Keywords
regularization; recursive least squares algorithm; generalization ability; local linearized least squares algorithm