Journal Articles — 1 article found
Gradient Amplification: An Efficient Way to Train Deep Neural Networks (Cited by: 6)
Authors: Sunitha Basodi, Chunyan Ji, Haiping Zhang, Yi Pan. Big Data Mining and Analytics (EI), 2020, Issue 3, pp. 196-207 (12 pages).
Improving the performance of deep learning models and reducing their training times are ongoing challenges in deep neural networks. Several approaches have been proposed to address these challenges, one of which is to increase the depth of the network. Such deeper networks not only take longer to train, but also suffer from the vanishing gradients problem during training. In this work, we propose a gradient amplification approach for training deep learning models to prevent vanishing gradients, and we also develop a training strategy that enables or disables the gradient amplification method across several epochs with different learning rates. We perform experiments on VGG-19 and ResNet models (ResNet-18 and ResNet-34), and study the impact of the amplification parameters on these models in detail. Our proposed approach improves the performance of these deep learning models even at higher learning rates, thereby allowing them to achieve higher performance with reduced training time.
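The abstract describes two ingredients: scaling gradients during backpropagation to counter vanishing gradients, and a schedule that turns that amplification on or off across epochs as the learning rate changes. The following is a minimal PyTorch sketch of that idea, not the paper's exact method: the factor AMP_FACTOR, the choice of layers to hook, and the epoch schedule are all illustrative assumptions.

```python
import torch
import torch.nn as nn

amplification_on = False  # toggled across epochs by the schedule below
AMP_FACTOR = 2.0          # hypothetical amplification factor; the paper studies several values

def amplify(grad):
    # Called during backward(); scales the incoming gradient when
    # amplification is enabled, otherwise passes it through unchanged.
    return grad * AMP_FACTOR if amplification_on else grad

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                      nn.Linear(256, 256), nn.ReLU(),
                      nn.Linear(256, 10))

# Hook the parameters of the earlier layers (farthest from the output),
# where vanishing gradients are most severe.
for p in list(model[0].parameters()) + list(model[2].parameters()):
    p.register_hook(amplify)

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(20):
    # Hypothetical schedule: amplify only during the middle epochs; the
    # paper interleaves amplified and normal epochs across learning-rate stages.
    amplification_on = 5 <= epoch < 15
    x = torch.randn(64, 784)             # stand-in batch; replace with real data
    y = torch.randint(0, 10, (64,))
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()                      # hooks fire here, scaling the gradients
    optimizer.step()
```

One design note on this sketch: doing the scaling in a gradient hook leaves the forward pass and the loss untouched, so amplification changes only the update step and can be switched per epoch by flipping a flag rather than rebuilding the model.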
Keywords: deep learning; gradient amplification; learning rate; backpropagation; vanishing gradients