A lightweight generative image compression method based on deep residual network (Cited by: 1)
Abstract: Generative image compression methods based on deep learning reconstruct images with high visual fidelity at low bit rates, but their networks have large parameter counts and demand substantial computing resources and model memory, which hinders practical deployment. To address this problem, deep lightweight residual blocks are used to optimize the residual network and streamline the decoder. First, each two-layer residual block is replaced with a three-layer lightweight residual block; the added depth extracts more abstract image features, allowing the generator to better model the distribution of real samples. Second, an extra non-linear activation and normalization layer is added to each residual block, alleviating the vanishing-gradient problem during training. Experimental results show that the proposed method matches existing methods in rate-distortion performance and perceptual metrics, while reducing the number of network parameters by 30.44%, training time by 33.3%, and model size by 25%.
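The abstract describes replacing two-layer residual blocks in the decoder with deeper three-layer lightweight blocks, each carrying an extra normalization and non-linear activation. A minimal PyTorch sketch of such a block is given below; the bottleneck width, the choice of instance normalization, and the 1×1/3×3/1×1 layer layout are assumptions for illustration, not the authors' actual architecture.

```python
import torch
import torch.nn as nn

class LightweightResidualBlock(nn.Module):
    """Sketch of a three-layer lightweight residual block (assumed design).

    A 1x1 conv reduces channels, a 3x3 conv transforms features, and a
    1x1 conv restores the width; each layer is followed by normalization
    and a non-linear activation, and an identity shortcut is added so
    gradients can flow directly through the block.
    """
    def __init__(self, channels: int):
        super().__init__()
        mid = channels // 2  # bottleneck width (assumption)
        self.body = nn.Sequential(
            nn.Conv2d(channels, mid, kernel_size=1),        # reduce
            nn.InstanceNorm2d(mid),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, kernel_size=3, padding=1),  # transform
            nn.InstanceNorm2d(mid),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, channels, kernel_size=1),        # expand
            nn.InstanceNorm2d(channels),                    # extra norm +
            nn.ReLU(inplace=True),                          # activation
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Identity shortcut: output keeps the input's shape.
        return x + self.body(x)

if __name__ == "__main__":
    block = LightweightResidualBlock(64)
    y = block(torch.randn(1, 64, 32, 32))
    print(tuple(y.shape))  # spatial size and channel count are preserved
```

Under these assumptions the bottleneck pays off in parameter count: for 64 channels, the three-layer block above holds roughly 13.4K weights, versus about 73.9K for a plain two-layer 3×3 residual block of the same width, which is consistent in spirit with the parameter reduction the paper reports.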
Authors: YAN Xue; ZHU Qibin; CHEN Juxia; XIA Qiaoqiao (School of Physical Science and Technology, Central China Normal University, Wuhan 430079, China)
Source: Laser Journal (《激光杂志》), CAS, Peking University Core Journal, 2022, No. 9, pp. 76-82 (7 pages)
Funding: National Natural Science Foundation of China (No. 62101204); Natural Science Foundation of Hubei Province (No. 2020CFB474); Fundamental Research Funds for the Central Universities (No. CCNU20ZT002)
Keywords: generative image compression; low bit rate; residual network; lightweight; deep learning