Journal Article

DRIB: Interpreting DNN with Dynamic Reasoning and Information Bottleneck

Abstract: The interpretability of deep neural networks has attracted widespread attention in both academia and industry. This paper proposes a new method, dynamic reasoning and information bottleneck (DRIB), to improve human interpretability and understandability. In this method, a novel dynamic reasoning decision algorithm is proposed to reduce multiply-accumulate operations and improve the interpretability of the computation. The information bottleneck is introduced into the DRIB model to verify the attribution correctness of the dynamic reasoning module. DRIB reduces the computational burden by approximately 50% while maintaining an accuracy of approximately 93%. Information bottleneck theory verifies the effectiveness of the method, with a credibility of approximately 85%. In addition, visual verification shows that the highlighted region can cover 50% of the predicted region, making the explanations more evident. Experiments demonstrate that the dynamic reasoning decision algorithm and information bottleneck theory can be combined. Moreover, the method provides users with good interpretability and understandability, making deep neural networks trustworthy.
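The abstract does not give the details of the dynamic reasoning decision algorithm, but reducing multiply-accumulate (MAC) operations at inference time is commonly achieved with early-exit style conditional computation. The sketch below is a hypothetical illustration of that general idea, not the paper's actual algorithm: each block of a toy network is followed by a lightweight exit head, and computation stops once the head's prediction confidence passes a threshold, so the MACs of the remaining blocks are skipped. All names (`dynamic_reasoning_forward`, `threshold`, the toy layer shapes) are invented for this example.

```python
import numpy as np


def softmax(z):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(z - z.max())
    return e / e.sum()


def dynamic_reasoning_forward(x, layers, heads, threshold=0.9):
    """Early-exit forward pass (hypothetical sketch, not the paper's method).

    After each block, a cheap classifier head checks prediction
    confidence; if it exceeds `threshold`, the remaining blocks are
    skipped, saving their multiply-accumulate operations.
    Returns (class probabilities, index of exit block, MACs used).
    """
    macs = 0
    h = x
    p = None
    for i, (W, head) in enumerate(zip(layers, heads)):
        h = np.maximum(W @ h, 0.0)   # one ReLU block
        macs += W.size               # MACs spent in this block
        p = softmax(head @ h)        # lightweight exit head
        macs += head.size
        if p.max() >= threshold:     # confident enough: stop computing
            return p, i, macs
    return p, len(layers) - 1, macs


# Toy network: 3 blocks of width 8, each with a 2-class exit head.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((8, 8)) * 0.5 for _ in range(3)]
heads = [rng.standard_normal((2, 8)) for _ in range(3)]
x = rng.standard_normal(8)

probs, exit_block, macs_used = dynamic_reasoning_forward(x, layers, heads)
full_macs = sum(W.size for W in layers) + sum(h.size for h in heads)
print(exit_block, macs_used, full_macs)
```

Whenever the network exits before the last block, `macs_used` is strictly less than `full_macs`; the roughly 50% reduction reported in the abstract would correspond to inputs typically exiting about halfway through the network.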
Source: Proceedings of the International Conference of Pioneering Computer Scientists, Engineers and Educators (ICPCSEE), 2022, No. 1, pp. 178-189 (12 pages).