
Label-Free Network Pruning via Reinforcement Learning (Cited by: 3)
Abstract: To remove redundant structures from deep neural networks and find a network structure that balances performance and complexity, a label-free global compression learning method (LFGCL) is proposed. LFGCL learns a global pruning strategy based on a representation of the network architecture, effectively avoiding the suboptimal compression rates that arise when the network is pruned layer by layer. The method does not depend on data labels during pruning; instead, the network architecture is optimized by making the pruned network output features similar to those of the baseline network. The compression ratios of all layers are inferred through reinforcement learning, and the deep deterministic policy gradient (DDPG) algorithm is applied to explore the optimal network structure. Experiments on multiple datasets show that LFGCL achieves better performance.
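The abstract describes two ingredients that can be illustrated with a short sketch: a label-free reward that scores a candidate set of per-layer compression ratios by how closely the pruned network's features match the baseline's on unlabeled data, and an agent that searches over those ratios. The Python/PyTorch sketch below is not the paper's code: the toy network, the channel-masking approximation of structured pruning, the penalty weight alpha, and the random search standing in for the DDPG agent are all illustrative assumptions.

```python
# Minimal sketch of an LFGCL-style label-free pruning reward and search loop.
# Assumptions (not from the paper): toy network, channel masking as a proxy for
# structured pruning, reward form, and random search in place of the DDPG agent.
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)

# Baseline network; its output features serve as the label-free supervision signal.
baseline = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
).eval()

def masked_copy(net, keep_ratios):
    """Approximate structured pruning: zero out the output channels of each conv
    layer whose filters have the smallest L1 norm, keeping `keep_ratio` of them."""
    pruned = copy.deepcopy(net)
    convs = [m for m in pruned.modules() if isinstance(m, nn.Conv2d)]
    for conv, keep in zip(convs, keep_ratios):
        norms = conv.weight.detach().abs().sum(dim=(1, 2, 3))  # per-filter L1 norm
        n_keep = max(1, int(round(keep * norms.numel())))
        drop = norms.argsort()[:-n_keep]                        # weakest filters
        with torch.no_grad():
            conv.weight[drop] = 0.0
            if conv.bias is not None:
                conv.bias[drop] = 0.0
    return pruned

def label_free_reward(net, keep_ratios, x, alpha=0.1):
    """Reward = feature similarity to the baseline on unlabeled inputs,
    minus a penalty on the remaining channel budget (no labels required)."""
    with torch.no_grad():
        f_base = net(x)
        f_pruned = masked_copy(net, keep_ratios)(x)
    similarity = -nn.functional.mse_loss(f_pruned, f_base).item()
    budget = float(sum(keep_ratios)) / len(keep_ratios)
    return similarity - alpha * budget

# Stand-in search loop: the paper uses DDPG to propose per-layer ratios; a random
# search over [0.2, 1.0] is used here only to show how the reward drives the agent.
n_convs = sum(isinstance(m, nn.Conv2d) for m in baseline.modules())
x = torch.randn(8, 3, 32, 32)  # unlabeled batch
best_ratios, best_reward = None, -float("inf")
for _ in range(50):
    ratios = (0.2 + 0.8 * torch.rand(n_convs)).tolist()  # one keep ratio per conv
    r = label_free_reward(baseline, ratios, x)
    if r > best_reward:
        best_ratios, best_reward = ratios, r
print("best keep ratios:", [round(k, 2) for k in best_ratios],
      "reward:", round(best_reward, 4))
```

In the paper's setting, a DDPG actor would replace the random proposals, treating the per-layer compression ratios as a continuous action and the label-free feature-matching reward as its training signal.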
Authors: LIU Huidong; DU Fang; YU Zhenhua; SONG Lijuan (School of Information Engineering, Ningxia University, Yinchuan 750021; Collaborative Innovation Center for Ningxia Big Data and Artificial Intelligence Co-founded by Ningxia Municipality and Ministry of Education, Ningxia University, Yinchuan 750021)
Source: Pattern Recognition and Artificial Intelligence (EI, CSCD, Peking University Core), 2021, No. 3, pp. 214-222 (9 pages)
Funding: Supported by the National Natural Science Foundation of China (No. 61901238) and the Natural Science Foundation of Ningxia (No. 2018AAC03020, No. 2018AAC03025).
Keywords: Deep Neural Network (DNN); Network Pruning; Network Architecture Search; Reinforcement Learning
