
MU-GAN: Facial Attribute Editing Based on Multi-Attention Mechanism (cited by 6)

Abstract: Facial attribute editing has two main objectives: 1) translating an image from a source domain to a target one, and 2) changing only the facial regions related to a target attribute while preserving the attribute-excluding details. In this work, we propose a multi-attention U-Net-based generative adversarial network (MU-GAN). First, we replace the classic convolutional encoder-decoder in the generator with a symmetric U-Net-like structure, and then apply an additive attention mechanism to build attention-based U-Net connections that adaptively transfer encoder representations to complement the decoder with attribute-excluding detail and enhance attribute-editing ability. Second, a self-attention (SA) mechanism is incorporated into the convolutional layers to model long-range and multi-level dependencies across image regions. Experimental results indicate that our method balances attribute-editing ability against detail-preservation ability, and can decouple the correlations among attributes. It outperforms state-of-the-art methods in terms of attribute manipulation accuracy and image quality. Our code is available at https://github.com/SuSir1996/MU-GAN.
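The attention-based U-Net connections described in the abstract can be sketched with an additive attention gate: encoder features are re-weighted by a mask computed jointly from the encoder skip features and the decoder's gating signal. The following is a minimal NumPy illustration of that idea, not the paper's exact layer; the dense (rather than convolutional) formulation, the tensor shapes, and the weight names `W_x`, `W_g`, and `psi` are all simplifying assumptions.

```python
import numpy as np

def additive_attention_gate(x, g, W_x, W_g, psi):
    """Additive attention gate (Attention-U-Net-style sketch):
    score = relu(x W_x + g W_g) psi; mask = sigmoid(score); output = x * mask."""
    s = np.maximum(x @ W_x + g @ W_g, 0.0)     # (N, d_int) joint features
    alpha = 1.0 / (1.0 + np.exp(-(s @ psi)))   # (N, 1) attention coefficients in (0, 1)
    return x * alpha                           # gated encoder features, same shape as x

rng = np.random.default_rng(0)
N, d_x, d_g, d_int = 4, 8, 8, 16               # toy sizes: 4 "pixels", 8-dim features
x = rng.standard_normal((N, d_x))              # encoder features (skip connection)
g = rng.standard_normal((N, d_g))              # decoder gating signal
W_x = rng.standard_normal((d_x, d_int))
W_g = rng.standard_normal((d_g, d_int))
psi = rng.standard_normal((d_int, 1))

out = additive_attention_gate(x, g, W_x, W_g, psi)
print(out.shape)  # (4, 8)
```

Because the sigmoid mask lies in (0, 1), the gate can only attenuate encoder features, letting the decoder keep attribute-excluding detail where the mask is near 1 and suppress it where the target attribute is being edited.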
Published in: IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2021, No. 9, pp. 1614-1626 (13 pages). 自动化学报(英文版)
Funding: Supported in part by the National Natural Science Foundation of China (NSFC) (62076093, 61871182, 61302163, 61401154), the Beijing Natural Science Foundation (4192055), the Natural Science Foundation of Hebei Province of China (F2015502062, F2016502101, F2017502016), the Fundamental Research Funds for the Central Universities (2020YJ006, 2020MS099), and the Open Project Program of the National Laboratory of Pattern Recognition (NLPR) (201900051). The authors gratefully acknowledge the support of NVIDIA Corporation with the donation of the GPU used for this research.
