
Data Generation Based on Generative Adversarial Network with Spatial Features (Cited by: 5)
Abstract: When the feature map is large, the traditional Generative Adversarial Network (GAN) ignores the representation and structural information of the original features, and the pixels of the generated images lack long-range correlation, resulting in low image quality. To further improve the quality of the generated images, this paper proposes a data generation method based on a Generative Adversarial Network with Spatial Features (SF-GAN). First, a spatial pyramid network is added to both the generator and the discriminator to better capture important descriptive information such as image edges. Then, the features of the generator and discriminator are strengthened to model long-range correlations between pixels. Experiments are performed on small-scale benchmarks (CelebA, SVHN, and CIFAR-10). Qualitative evaluation and quantitative evaluation with the Inception Score (IS) and the Fréchet Inception Distance (FID) show that the proposed method generates higher-quality images than the Wasserstein GAN with Gradient Penalty (WGAN-GP) and the Self-Attention GAN (SAGAN). Further experiments show that the generated data can additionally improve the training of a classification model.
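The abstract only names the two architectural ideas; the exact layers of SF-GAN are not given here. Below is a minimal, self-contained PyTorch sketch of what such building blocks could look like: a pyramid-pooling block that preserves multi-scale spatial structure (one common realization of a "spatial pyramid network"), and a SAGAN-style self-attention block as one way to model long-range correlations between pixels. All module names, channel sizes, and pooling scales are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialPyramidBlock(nn.Module):
    """Pools the feature map at several scales, then fuses the upsampled
    pyramid levels back with the input (PSPNet-style pyramid pooling)."""
    def __init__(self, channels, bins=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.AdaptiveAvgPool2d(b),
                nn.Conv2d(channels, channels // len(bins), 1, bias=False),
                nn.ReLU(inplace=True),
            )
            for b in bins
        ])
        fused = channels + (channels // len(bins)) * len(bins)
        self.project = nn.Conv2d(fused, channels, 1)

    def forward(self, x):
        h, w = x.shape[-2:]
        levels = [F.interpolate(branch(x), size=(h, w), mode="bilinear",
                                align_corners=False)
                  for branch in self.branches]
        return self.project(torch.cat([x] + levels, dim=1))

class SelfAttentionBlock(nn.Module):
    """SAGAN-style self-attention: every pixel attends to every other pixel,
    one way to inject the long-range correlations the abstract refers to
    as 'feature strengthening'."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x):
        n, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (N, HW, C//8)
        k = self.key(x).flatten(2)                      # (N, C//8, HW)
        attn = torch.softmax(q @ k, dim=-1)             # (N, HW, HW)
        v = self.value(x).flatten(2)                    # (N, C, HW)
        out = (v @ attn.transpose(1, 2)).view(n, c, h, w)
        return self.gamma * out + x

if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)          # dummy 32x32 feature map
    x = SpatialPyramidBlock(64)(x)
    x = SelfAttentionBlock(64)(x)
    print(x.shape)                           # torch.Size([2, 64, 32, 32])
```

In an SF-GAN-like setup, blocks of this kind would presumably be interleaved with the usual convolutional up-sampling stages of the generator and down-sampling stages of the discriminator; where exactly they are inserted is a design choice the abstract does not specify.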
Authors: SUN Lei, YANG Yu, MAO Xiuqing, WANG Xiaoqin, LI Jiaxin (PLA Strategic Support Force Information Engineering University, Zhengzhou 450001, China)
Source: Journal of Electronics &amp; Information Technology (《电子与信息学报》), 2023, No. 6, pp. 1959-1969 (11 pages); indexed in EI, CSCD, and the Peking University Core Journals list
Keywords: Generative Adversarial Network (GAN); Spatial pyramid network; Feature strengthening; Feature map