Abstract
To reduce the storage and computation overhead of the parameter-heavy, overfitting-prone fully connected layers in convolutional neural networks for image classification, a compressed network architecture is proposed that replaces the fully connected layers of the traditional architecture with a 1 × 1 convolutional layer and global average pooling. Experiments show that on the same dataset (CIFAR-10), the compressed network requires less storage space than the original one and performs as well as or slightly better in the training and classification task.
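The substitution the abstract describes (a 1 × 1 convolution producing one feature map per class, followed by global average pooling instead of a fully connected classifier) can be sketched numerically. All shapes and weights below are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

def conv1x1(feature_maps, weights):
    """1x1 convolution: a per-pixel linear map across channels.
    feature_maps: (C_in, H, W); weights: (C_out, C_in)."""
    c_in, h, w = feature_maps.shape
    flat = feature_maps.reshape(c_in, -1)      # (C_in, H*W)
    out = weights @ flat                       # (C_out, H*W)
    return out.reshape(weights.shape[0], h, w)

def global_average_pool(feature_maps):
    """Average each channel's map to a single scalar -> (C,) class scores."""
    return feature_maps.mean(axis=(1, 2))

rng = np.random.default_rng(0)
features = rng.standard_normal((64, 8, 8))  # assumed backbone output
w = rng.standard_normal((10, 64))           # 10 classes, as in CIFAR-10
scores = global_average_pool(conv1x1(features, w))
print(scores.shape)  # (10,)
```

Unlike a fully connected layer over the flattened 64 × 8 × 8 features (which would need 10 × 4096 weights here), the 1 × 1 convolution uses only 10 × 64, which is the storage saving the abstract refers to.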
Authors
YE Zi; XIAO Shibin (Computer School, Beijing Information Science & Technology University, Beijing 100101, China; Beijing TRS Information Technology Co., Ltd, Beijing 100101, China)
Source
Journal of Beijing Information Science and Technology University (Natural Science Edition), 2018, No. 3, pp. 52-56 (5 pages)
Fund
National 863 Program project (2015AA015409)
Keywords
convolutional neural network
image classification
global average pooling