Deep convolutional neural networks are growing ever deeper. However, the resulting model complexity makes them prone to overfitting during training. Dropout, one of the crucial regularization tricks, prevents units from co-adapting too much by randomly dropping neurons during training. It effectively improves the performance of deep networks but ignores the differences between neurons. To address this issue, this paper presents a new dropout method called guided dropout, which selects the neurons to switch off according to the differences between their convolution kernels and preserves the informative neurons. It uses an unsupervised clustering algorithm to group similar neurons in each hidden layer, and dropout is then applied with a given probability within each cluster. This preserves hidden-layer neurons with different roles while maintaining the model's sparsity and generalization, which strengthens the contribution of hidden-layer neurons to feature learning. We compared our approach with two standard dropout networks on three well-established public object detection datasets. Experimental results show that the proposed method improves false positives, the precision-recall curve, and average precision without increasing the amount of computation. The performance gain of guided dropout can be attributed to shallow learning in the networks. The concept of guided dropout should also benefit other vision tasks.
Funding: This work is supported by the National Natural Science Foundation of China (Project No. U19B2036).
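The abstract only outlines the mechanism, so the following is a minimal PyTorch sketch of the idea rather than the authors' implementation: the convolution kernels of a layer are grouped by an unsupervised clustering step (k-means is assumed here; this abstract does not name the algorithm), and dropout is then sampled independently inside each cluster so that neurons playing different roles stay represented. The `GuidedDropout` class, its parameters, and the at-least-one-survivor rule per cluster are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn
from sklearn.cluster import KMeans


class GuidedDropout(nn.Module):
    """Sketch of guided dropout: cluster a conv layer's kernels by
    similarity, then drop channels independently within each cluster
    so every cluster (each distinct "role") keeps active members."""

    def __init__(self, conv: nn.Conv2d, n_clusters: int = 4, p: float = 0.5):
        super().__init__()
        self.p = p
        # Flatten each output-channel kernel into a vector and cluster them.
        # k-means is an assumption used for illustration only.
        kernels = conv.weight.detach().flatten(1).cpu().numpy()
        labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(kernels)
        self.register_buffer("labels", torch.as_tensor(labels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training:
            return x
        # Build a per-channel keep mask, sampled cluster by cluster.
        mask = torch.ones(x.size(1), device=x.device)
        for c in self.labels.unique():
            idx = (self.labels == c).nonzero(as_tuple=True)[0]
            keep = (torch.rand(len(idx), device=x.device) > self.p).float()
            # Keep at least one channel per cluster so each "role" survives
            # (a design choice for this sketch, not stated in the paper).
            if keep.sum() == 0:
                keep[torch.randint(len(idx), (1,), device=x.device)] = 1.0
            mask[idx] = keep
        # Rescale as in inverted dropout so the expected activation is unchanged.
        mask = mask / (1.0 - self.p)
        return x * mask.view(1, -1, 1, 1)
```

In use, such a module would sit directly after the convolution whose kernels were clustered, e.g. `nn.Sequential(conv, GuidedDropout(conv), nn.ReLU())`, replacing a plain channel-wise dropout layer.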