Abstract
Gender transfer of face images is a special case of the image style transfer problem. General-purpose generative adversarial network (GAN) models often fail to transfer the face region with high quality: the irrelevant background region frequently becomes distorted and blurred, and the original skin color of the face is not preserved. To address these problems, this paper builds on a face gender conversion model based on improved MUNIT and proposes a robust face image gender transfer model. First, face parsing is applied to each input image so that only the face region is accurately fed to the model for training, eliminating the influence of the irrelevant background region on training. Second, a new loss function is constructed that performs color-based histogram matching between the face region before and after generation, so that skin color remains consistent across the gender transfer. Finally, attribute screening is carried out on the public CelebA face dataset to reduce factors that hinder model training, such as facial occlusion and eyeglasses, thereby improving the quality of the generated images. Experimental results show that, compared with other classical algorithms, the proposed method effectively preserves the image background and facial skin color, and produces better face gender transfer results.
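As an illustration of the first step, the sketch below (not the paper's implementation) masks out everything except the parsed face region. It assumes a pre-trained face-parsing network has already produced a per-pixel label map; the class IDs follow a CelebAMask-HQ-style labeling and are an assumption, since the abstract does not specify the parsing backbone.

```python
import numpy as np

# Hypothetical label IDs for face-related classes (skin, brows, eyes, nose,
# lips, ...) in a CelebAMask-HQ-style parsing map; the real IDs depend on
# the parsing model used, which the paper does not specify.
FACE_CLASS_IDS = [1, 2, 3, 4, 5, 10, 11, 12, 13]

def mask_face_region(image: np.ndarray, parsing: np.ndarray) -> np.ndarray:
    """Zero out everything except the parsed face region.

    image:   H x W x 3 uint8 face photo
    parsing: H x W integer label map from a face-parsing network
    """
    mask = np.isin(parsing, FACE_CLASS_IDS)              # H x W boolean mask
    return image * mask[..., None].astype(image.dtype)   # broadcast over RGB
```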
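The color-consistency idea can be sketched with scikit-image's `match_histograms`, applied to the masked face regions before and after generation. The paper folds histogram matching into a new loss term; the L1 penalty below is an assumed formulation for illustration, not the paper's exact definition.

```python
import numpy as np
from skimage.exposure import match_histograms  # scikit-image >= 0.19

def histogram_matching_loss(generated: np.ndarray, source: np.ndarray) -> float:
    """L1 distance between a generated face and its histogram-matched version.

    Both inputs are H x W x 3 arrays holding only the masked face region;
    channel_axis=-1 matches each RGB channel independently.
    """
    matched = match_histograms(generated, source, channel_axis=-1)
    diff = generated.astype(np.float32) - matched.astype(np.float32)
    return float(np.abs(diff).mean())
```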
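The dataset-screening step can be reproduced from CelebA's official attribute file `list_attr_celeba.txt`, which marks 40 binary (+1/-1) attributes per image. Filtering on the `Eyeglasses` attribute follows the abstract; adding `Wearing_Hat` as an occlusion proxy is an assumption, as the paper's exact criteria are not listed here.

```python
import pandas as pd

# list_attr_celeba.txt layout: line 1 = image count, line 2 = 40 attribute
# names, then one row per image: "<filename> <+1/-1> x 40". Because data
# rows have one more field than the header, pandas uses the filename as
# the row index.
attrs = pd.read_csv("list_attr_celeba.txt", skiprows=1, sep=r"\s+")

# Keep images without glasses or hats (the hat criterion is an assumed
# proxy for occlusion; the paper's attribute list may differ).
kept = attrs[(attrs["Eyeglasses"] == -1) & (attrs["Wearing_Hat"] == -1)]
print(f"kept {len(kept)} of {len(attrs)} images")
kept.index.to_series().to_csv("filtered_images.txt", index=False, header=False)
```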
Source
Computer Science and Application (《计算机科学与应用》), 2023, No. 2, pp. 191-203 (13 pages)