
PRAP-PIM: A weight pattern reusing aware pruning method for ReRAM-based PIM DNN accelerators

Abstract: Resistive Random-Access Memory (ReRAM)-based Processing-in-Memory (PIM) frameworks have been proposed to accelerate DNN models by eliminating data movement between the computing and memory units. To further reduce space and energy consumption, DNN model weight sparsity and weight pattern repetition are exploited to optimize these ReRAM-based accelerators. However, most of these works focus on only one aspect of this software/hardware co-design framework and optimize each individually, leaving the design far from optimal. In this paper, we propose PRAP-PIM, which jointly exploits weight sparsity and weight pattern repetition through a weight pattern reusing aware pruning method. By relaxing the weight pattern reusing precondition, we propose a similarity-based weight pattern reusing method that achieves a higher weight pattern reusing ratio. Experimental results show that PRAP-PIM achieves a 1.64× performance improvement and a 1.51× energy efficiency improvement on popular deep learning benchmarks, compared with state-of-the-art ReRAM-based DNN accelerators.
Source: High-Confidence Computing, 2023, Issue 2, pp. 50–59 (10 pages).
Funding: Partially supported by the National Natural Science Foundation of China (92064008), the CCF-Huawei Huyanglin Project (CCF-HuaweiST2021002), the Open Project Program of Wuhan National Laboratory for Optoelectronics (2022WNLOKF018), and the Shandong Provincial Natural Science Foundation (ZR2022LZH010).
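The core idea described in the abstract, reusing one stored weight pattern for several sufficiently similar patterns rather than requiring exact repetition, can be illustrated with a minimal sketch. This is not the paper's actual algorithm; the greedy matching, the relative-L2 similarity test, and the `tol` threshold are all assumptions made for illustration.

```python
import numpy as np

def reuse_patterns(patterns, tol=0.1):
    """Greedy similarity-based pattern reuse (illustrative sketch only).

    patterns: (n, k) array of weight patterns (e.g., crossbar columns).
    tol: maximum relative L2 distance at which a pattern may reuse a
         previously stored representative (hypothetical criterion).
    Returns (representatives, assignment), where assignment[i] is the
    index of the representative reused by pattern i.
    """
    reps = []        # representatives actually stored in the crossbar
    assignment = []  # which representative each input pattern maps to
    for p in patterns:
        matched = None
        for j, r in enumerate(reps):
            # Relaxed precondition: reuse when patterns are merely
            # similar, not byte-identical.
            if np.linalg.norm(p - r) <= tol * (np.linalg.norm(r) + 1e-12):
                matched = j
                break
        if matched is None:
            reps.append(p.copy())
            matched = len(reps) - 1
        assignment.append(matched)
    return np.array(reps), assignment
```

Relaxing exact equality to a similarity test is what raises the reuse ratio: slightly perturbed copies of a pattern collapse onto one stored representative instead of each occupying crossbar space.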
