
A Lightweight Two-channel Image Semantic Segmentation Model

Cited by: 1
Abstract: Since the rise of deep neural networks, semantic segmentation models have achieved high accuracy on commonly used indoor and outdoor segmentation datasets. Most current models adopt an encoder-decoder structure: the input image is encoded by a multi-layer down-sampling convolutional network, and a symmetric or asymmetric decoder then restores the image size and outputs the semantic segmentation map. Although this improves segmentation accuracy, such models are complex and carry large parameter counts, making them difficult to deploy on mobile devices such as cars and phones. The model in this paper reduces the parameter count and improves segmentation efficiency while maintaining segmentation accuracy. To preserve accuracy, feature extraction is split into two paths, spatial feature extraction and contextual feature extraction, and the two types of feature maps are fused and then restored by up-sampling; at the same time, 1×1 convolutions are used to reduce the number of convolution channels, which greatly cuts the parameter count. Tested on the CamVid dataset, the model has only 5.8M parameters and achieves an mIoU of 58.6%, greatly reducing the parameter count while maintaining segmentation accuracy.
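The abstract attributes much of the parameter saving to 1×1 convolutions that shrink the channel count before the heavier 3×3 convolutions. A minimal back-of-the-envelope sketch (the channel sizes 256 and 64 are hypothetical, not taken from the paper) illustrates the effect:

```python
def conv_params(k, c_in, c_out):
    """Weight count of a k x k convolution layer (biases ignored)."""
    return k * k * c_in * c_out

# Direct 3x3 convolution keeping 256 channels throughout.
direct = conv_params(3, 256, 256)

# Bottleneck variant: 1x1 reduce to 64 channels, do the 3x3
# convolution in the narrow space, then 1x1 expand back to 256.
bottleneck = (conv_params(1, 256, 64)
              + conv_params(3, 64, 64)
              + conv_params(1, 64, 256))

print(direct, bottleneck, round(direct / bottleneck, 1))
# → 589824 69632 8.5
```

With these assumed channel sizes the bottleneck needs roughly an eighth of the weights of the direct convolution, which is the kind of reduction that makes a 5.8M-parameter model feasible.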
Author: JI Zhuang-wei (Department of Computer Science, Changzhi University, Changzhi, Shanxi 046011, China)
Source: Journal of Shanxi Datong University (Natural Science Edition), 2022, No. 5, pp. 6-8
Keywords: semantic segmentation; parameter count; receptive field