Abstract
To remove redundant structures from deep neural networks and find a network structure with a good balance between performance and complexity, a label-free global compression learning method (LFGCL) is proposed. LFGCL learns a global pruning strategy based on a representation of the network architecture, which effectively avoids the suboptimal compression rates caused by pruning the network layer by layer. The method does not rely on data labels during pruning; instead, the network architecture is optimized by forcing the pruned network to output features similar to those of the baseline network. The compression ratios of all layers are inferred through reinforcement learning, and the deep deterministic policy gradient (DDPG) algorithm is employed to explore the optimal network structure. Experiments on multiple datasets show that LFGCL achieves better performance.
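As a rough illustration of the label-free optimization described above, the following minimal PyTorch-style sketch (not the authors' implementation; all names, layer widths, and compression ratios are hypothetical) trains a pruned network to match the feature output of a frozen baseline network on unlabeled inputs. In LFGCL the per-layer compression ratios would instead be proposed globally by a DDPG agent rather than fixed by hand.

# Hypothetical sketch, assuming a tiny conv net and hand-picked ratios;
# it only demonstrates the label-free feature-matching objective.
import torch
import torch.nn as nn

def make_net(widths):
    """Tiny conv net whose per-layer channel widths are given by `widths`."""
    layers, in_ch = [], 3
    for w in widths:
        layers += [nn.Conv2d(in_ch, w, 3, padding=1), nn.ReLU()]
        in_ch = w
    layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(in_ch, 64)]
    return nn.Sequential(*layers)

baseline_widths = [32, 64, 128]
ratios = [0.5, 0.75, 0.5]                  # assumed per-layer compression ratios
pruned_widths = [max(1, int(w * r)) for w, r in zip(baseline_widths, ratios)]

baseline = make_net(baseline_widths).eval()    # frozen reference network
for p in baseline.parameters():
    p.requires_grad_(False)
pruned = make_net(pruned_widths)               # compressed network to optimize

opt = torch.optim.Adam(pruned.parameters(), lr=1e-3)
mse = nn.MSELoss()

for step in range(100):
    x = torch.randn(16, 3, 32, 32)             # unlabeled data: random images here
    loss = mse(pruned(x), baseline(x))          # label-free feature-matching loss
    opt.zero_grad()
    loss.backward()
    opt.step()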
Authors
LIU Huidong, DU Fang, YU Zhenhua, SONG Lijuan
(School of Information Engineering, Ningxia University, Yinchuan 750021; Collaborative Innovation Center for Ningxia Big Data and Artificial Intelligence Co-founded by Ningxia Municipality and Ministry of Education, Ningxia University, Yinchuan 750021)
Source
Pattern Recognition and Artificial Intelligence (《模式识别与人工智能》)
Indexed in EI, CSCD, and the Peking University Core Journals list (北大核心)
2021, No. 3, pp. 214-222 (9 pages in total)
Funding
Supported by the National Natural Science Foundation of China (No. 61901238) and the Natural Science Foundation of Ningxia (No. 2018AAC03020, 2018AAC03025).