
Binarized Convolutional Neural Network Based on Feature Map Chunking Offset (Cited by: 1)
Abstract: In recent years, as convolutional neural networks (CNNs) have grown larger and structurally more complex, their deployment on performance-constrained embedded platforms has become difficult. To address this, researchers have proposed model binarization, which substantially improves network storage and computational efficiency but introduces information loss, leading to a decrease in model accuracy. This paper proposes two remedies for the information loss problem in binarized convolutional neural networks. First, it proposes max-min pooling, which enables the downsampling layer to retain both positive and negative information. Second, it analyzes the causes of information loss during binary activation of feature maps and proposes a binarization method based on feature map chunking offset, which effectively preserves the local information of feature maps after binary quantization. Finally, experiments show that the proposed methods effectively improve the performance of binarized network models.
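The abstract names two techniques but the paper body is not included here, so the following is only a minimal NumPy sketch of one plausible reading: max-min pooling as "keep the window value of largest magnitude, so strong negative responses survive downsampling", and chunking offset as "center each feature-map block on its own mean before taking the sign". The function names, block size, and the exact offset (per-block mean) are assumptions, not the authors' published formulation.

```python
import numpy as np

def max_min_pool(x, k=2):
    # Hypothetical max-min pooling: in each k x k window, keep the value
    # with the largest magnitude, so negative information is not discarded
    # the way plain max pooling would discard it.
    h, w = x.shape
    out = np.empty((h // k, w // k), dtype=x.dtype)
    for i in range(0, h - h % k, k):
        for j in range(0, w - w % k, k):
            win = x[i:i + k, j:j + k]
            mx, mn = win.max(), win.min()
            out[i // k, j // k] = mx if abs(mx) >= abs(mn) else mn
    return out

def blockwise_offset_binarize(x, block=4):
    # Hypothetical chunking-offset binarization: subtract each block's mean
    # before thresholding at zero, so the +1/-1 pattern reflects local
    # structure rather than a single global threshold.
    out = np.empty_like(x)
    h, w = x.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            blk = x[i:i + block, j:j + block]
            out[i:i + block, j:j + block] = np.where(blk - blk.mean() >= 0, 1.0, -1.0)
    return out
```

For example, on the window `[[1, -5], [2, 3]]`, plain max pooling returns 3 and loses the strong negative response, while the sketch above returns -5; and a block whose values all sit above zero still binarizes to a mixed +1/-1 pattern because the threshold is the block's own mean.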
Authors: ZHANG Bangyuan; SHEN Tao; ZENG Kai (Kunming University of Science and Technology, Kunming, Yunnan 650500, China)
Affiliation: Kunming University of Science and Technology
Published in: Communications Technology (《通信技术》), 2022, No. 7, pp. 850-858.
Funding: National Natural Science Foundation of China (61971208); Yunnan Province Reserve Talent Program for Young and Middle-aged Academic and Technical Leaders (SHEN Tao, 2019HB005); Yunnan Province "Ten Thousand Talents Program" Young Top Talents (SHEN Tao, ZHU Yan; Yunnan Provincial Department of Human Resources and Social Security, 201873); Yunnan Province Major Science and Technology Project (02002AB080001-8); Photosynthesis Fund Class B (20220202, ghfund202202022131).
Keywords: binary convolutional neural network; deep learning; vehicle classification; gradient approximation