Abstract
Pruning and reactivation of unimportant weight elements can prevent the overparameterization of neural networks. However, weight reactivation is typically achieved by reactivating an entire filter, which limits classification accuracy. To address this problem, a filter weight competition training algorithm is proposed for training neural networks. Low-quality filters are selected and located in both local and global scope, and the corresponding high-quality filters are found via a forward matching strategy. The optimal and suboptimal weight elements of a high-quality filter are then used to cross-update the sub-worst and worst weight elements of the matched low-quality filter, reactivating weights trapped in local extrema within the network structure. Experimental results show that, compared with the benchmark networks, ordinary neural networks such as ResNet and DenseNet trained with the filter weight competition algorithm improve classification accuracy on the CIFAR dataset and Top-1 accuracy on the ImageNet dataset by 0.79 and 1.13 percentage points on average, respectively; lightweight neural networks such as MobileNet and ShuffleNet improve by 2.22 and 2.93 percentage points on average. The resulting image classification performance is thus better than that of the existing filter competition training algorithm.
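The cross-update step described above can be illustrated with a minimal sketch. Note the assumptions: filter quality is approximated here by the L1 norm of each filter's weights, and the function name `competitive_update` is hypothetical; the paper's actual quality criterion and matching strategy may differ.

```python
import numpy as np

def competitive_update(filters):
    """Sketch of one filter weight competition step.

    `filters` is a 2-D array, one flattened filter per row. The
    lowest-quality filter's two weakest (smallest-magnitude) weight
    elements are overwritten cross-wise by the highest-quality
    filter's optimal and suboptimal (largest-magnitude) weights:
    optimal -> sub-worst slot, suboptimal -> worst slot.
    Quality here is the L1 norm (an illustrative assumption).
    """
    quality = np.abs(filters).sum(axis=1)          # L1 norm per filter
    best, worst = np.argmax(quality), np.argmin(quality)

    # two largest-magnitude weights of the best filter: [suboptimal, optimal]
    top2 = np.argsort(np.abs(filters[best]))[-2:]
    # two smallest-magnitude weights of the worst filter: [worst, sub-worst]
    bot2 = np.argsort(np.abs(filters[worst]))[:2]

    updated = filters.copy()
    # cross update: optimal weight replaces the sub-worst element,
    # suboptimal weight replaces the worst element
    updated[worst, bot2[1]] = filters[best, top2[1]]
    updated[worst, bot2[0]] = filters[best, top2[0]]
    return updated
```

In practice such a step would run periodically during training so that reactivated weights are subsequently refined by gradient descent rather than left fixed.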
Authors
AN Zhiguo; PENG Zheng; YI Mancheng; LIU Jianxin; YU Sifan (Guangzhou Power Supply Bureau of Guangdong Power Grid Co., Ltd., Guangzhou 510630, China)
Source
Computer Engineering (《计算机工程》)
CAS
CSCD
Peking University Core Journal
2023, No. 4, pp. 120-124 (5 pages)
Funding
Science and Technology Project of China Southern Power Grid Co., Ltd. (GZHKJXM20200058).