Abstract
Pruning is an effective method for preventing over-fitting by simplifying the structure of a neural network. By introducing the pseudo-entropy of the weights as a penalty term in the objective function, a multilayer feed-forward network automatically constrains its weight distribution during training; weight sensitivity is then used as the pruning criterion, avoiding the arbitrariness of pruning by weight magnitude alone. Because only connections whose weights are both small in magnitude and low in sensitivity are removed, the network requires no retraining after simplification, and the efficiency of the algorithm improves markedly. Simulation results show that the method is simple and cheap to implement and clearly improves the generalization ability of feed-forward neural networks.
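The abstract does not give the exact pseudo-entropy or sensitivity formulas, so the sketch below is an illustrative assumption: it uses a Shannon-style entropy over normalized weight magnitudes as the penalty term, and the change in error when a single weight is deleted as that weight's sensitivity. The toy data, network size, and pruning thresholds are likewise hypothetical, chosen only to show the dual criterion (prune a connection only if its weight is both small and insensitive) without retraining.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative, not from the paper).
X = rng.uniform(-1.0, 1.0, (128, 2))
Y = np.sin(X @ np.array([[1.0], [0.5]]))

# A small feed-forward net; random weights stand in for a trained one.
W1 = rng.normal(0.0, 1.0, (2, 6))
W2 = rng.normal(0.0, 1.0, (6, 1))

def error(W1, W2):
    """Mean squared error of the two-layer net on the toy data."""
    return float(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2))

def pseudo_entropy(W):
    """Assumed penalty term: Shannon entropy of normalized |w|.
    The paper's exact pseudo-entropy definition is not given here."""
    p = np.abs(W).ravel() / np.abs(W).sum()
    return float(-(p * np.log(p + 1e-12)).sum())

# Penalized objective used during training (lam is an assumed constant).
lam = 1e-3
objective = error(W1, W2) + lam * pseudo_entropy(W1)

def sensitivity(W1, W2):
    """Per-weight sensitivity of the hidden layer: the absolute change
    in error when that single weight is deleted (set to zero)."""
    E0 = error(W1, W2)
    S = np.zeros_like(W1)
    for i in range(W1.shape[0]):
        for j in range(W1.shape[1]):
            Wp = W1.copy()
            Wp[i, j] = 0.0
            S[i, j] = abs(error(Wp, W2) - E0)
    return S

# Prune only weights that are BOTH small in magnitude and low in
# sensitivity, mirroring the dual criterion; thresholds are assumed.
S = sensitivity(W1, W2)
mask = (np.abs(W1) < 0.3) & (S < 1e-3)
W1_pruned = np.where(mask, 0.0, W1)

print("pruned weights:", int(mask.sum()))
print("error before:", error(W1, W2))
print("error after :", error(W1_pruned, W2))
```

Because every pruned connection individually changes the error by less than the sensitivity threshold, the simplified network's error stays close to the original, which is why no retraining pass is needed after pruning.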
Source
Computer Simulation (《计算机仿真》), CSCD
2006, No. 3, pp. 110-112 (3 pages)
Keywords
Multilayer feed-forward neural networks
Pruning algorithm
Objective function
Pseudo-entropy of weights
Sensitivity of weights