
Convolutional Sparse Coding Image Denoising Algorithm Combined with Shortcut Connections Structure (Cited by: 4)

Abstract: Although the convolutional sparse coding network (CSCNet) model can effectively solve the image denoising problem, the algorithm does not take into account that the stacking of convolutional and deconvolutional layers during the iterative solution of the approximate coding vectors changes the distribution of the original data. To address this, common techniques from the deep learning field are used to improve the original model. The effects on denoising performance of adding or omitting batch normalization (BN), nonlinear activation functions, and residual learning (RL) in the CSCNet model are discussed, and on this basis two different network structures are designed. To keep the distribution of the input data from changing as it propagates from layer to layer, Model 1 adds a nonlinear activation function and a BN layer to every layer of the original CSCNet network. Because the convolution kernels trained in CSCNet are all of the same size, Model 2 adds a simple residual block on top of Model 1 to increase the diversity of image features; the block changes how the original model's parameters propagate and is joined to the original input through a Shortcut Connections structure. The experimental results show that, without reducing the computational efficiency of the original model, the proposed models yield denoising results that are slightly better than those of the original convolutional sparse coding network.
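
The abstract names the building blocks of the two modified models but gives no implementation details. A minimal sketch, assuming a PyTorch-style implementation with illustrative channel counts and kernel sizes (these specifics are not given in the paper), might look like this:

# Hypothetical illustration only: layer sizes, activation choice, and module names
# are assumptions, not the authors' published implementation.
import torch
import torch.nn as nn

class ConvBNReLU(nn.Module):
    """Model 1 idea: follow each (de)convolution with BN and a nonlinear
    activation so the data distribution is re-normalized after every layer."""
    def __init__(self, in_ch, out_ch, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, padding=kernel_size // 2)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))

class SimpleResidualBlock(nn.Module):
    """Model 2 idea: a simple residual block whose output is joined to the
    block input through a shortcut connection."""
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        self.body = nn.Sequential(
            ConvBNReLU(channels, channels, kernel_size),
            nn.Conv2d(channels, channels, kernel_size, padding=kernel_size // 2),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # Shortcut connection: add the block input back to the transformed features.
        return self.act(self.body(x) + x)

if __name__ == "__main__":
    x = torch.randn(1, 1, 64, 64)        # a grayscale noisy patch
    layer = ConvBNReLU(1, 32)            # Model 1 style layer
    block = SimpleResidualBlock(32)      # Model 2 style residual block
    print(block(layer(x)).shape)         # torch.Size([1, 32, 64, 64])

Because the shortcut adds the input back to the transformed features, the channel count and spatial size must be preserved inside the block, which is why the sketch keeps the same number of channels and uses same-padding convolutions.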
Authors: 张膑 ZHANG Bin; 张运杰 ZHANG Yun-jie; 白明明 BAI Ming-ming (School of Science, Dalian Maritime University, Dalian 116026, China)
Source: Science Technology and Engineering (《科学技术与工程》, Peking University Core Journal), 2021, No. 26, pp. 11253-11262 (10 pages)
Keywords: sparse coding; convolutional sparse coding; batch normalization; residual learning