Abstract: This work introduces an optimal transportation (OT) view of generative adversarial networks (GANs). Natural datasets have intrinsic patterns, which can be summarized as the manifold distribution principle: the distribution of a class of data is close to a low-dimensional manifold. GANs mainly accomplish two tasks: manifold learning and probability distribution transformation. The latter can be carried out using the classical OT method. From the OT perspective, the generator computes the OT map, while the discriminator computes the Wasserstein distance between the generated data distribution and the real data distribution; both can be reduced to a convex geometric optimization process. Furthermore, OT theory reveals the intrinsically collaborative, rather than competitive, relation between the generator and the discriminator, and the fundamental reason for mode collapse. We also propose a novel generative model, which uses an autoencoder (AE) for manifold learning and an OT map for probability distribution transformation. This AE–OT model improves theoretical rigor and transparency, as well as computational stability and efficiency; in particular, it eliminates mode collapse. The experimental results validate our hypothesis and demonstrate the advantages of our proposed model.
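The probability-distribution-transformation step summarized above can be illustrated with a semi-discrete OT computation: fit the heights of a piecewise-linear Brenier potential so that each cell of the induced diagram carries the mass of one target sample, then map each source point to the target point of the cell containing it. The sketch below is a minimal illustration of this idea, not the authors' implementation; the uniform source distribution on the unit cube, the Monte Carlo estimate of cell measures, and names such as `fit_brenier_heights` and `ot_map` are assumptions made for the example.

```python
# Minimal sketch of semi-discrete OT under quadratic cost (illustrative only).
# Source: uniform measure on [0,1]^d; target: empirical measure on points Y
# with weights nu. The Brenier potential is u_h(x) = max_i (<x, y_i> + h_i),
# and the OT map sends x to the y_i attaining the max.
import numpy as np

def fit_brenier_heights(Y, nu, n_samples=20000, lr=0.1, n_iters=500, seed=0):
    """Fit heights h so that each cell W_i = {x : argmax_j(<x,y_j>+h_j) = i}
    carries source mass nu_i; cell measures are estimated by Monte Carlo."""
    rng = np.random.default_rng(seed)
    n, d = Y.shape
    h = np.zeros(n)
    X = rng.random((n_samples, d))                    # samples from the uniform source
    for _ in range(n_iters):
        idx = np.argmax(X @ Y.T + h, axis=1)          # cell assignment of each sample
        cell_mass = np.bincount(idx, minlength=n) / n_samples
        h -= lr * (cell_mass - nu)                    # gradient step on the convex energy
        h -= h.mean()                                 # heights are defined up to a constant
    return h

def ot_map(X, Y, h):
    """Piecewise-constant OT map: send each row of X to the target point of its cell."""
    return Y[np.argmax(X @ Y.T + h, axis=1)]

# Usage: transport uniform samples to an empirical distribution of 5 points in 2-D.
Y = np.array([[0.1, 0.2], [0.8, 0.3], [0.5, 0.9], [0.2, 0.7], [0.9, 0.8]])
nu = np.full(len(Y), 1.0 / len(Y))
h = fit_brenier_heights(Y, nu)
samples = np.random.default_rng(1).random((10, 2))
print(ot_map(samples, Y, h))
```

In the AE–OT setting described in the abstract, such a map would be computed in the latent space of an autoencoder, replacing the adversarially trained generator for the distribution-transformation task.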
Funding: the National Natural Science Foundation of China (61936002, 61772105, 61432003, 61720106005, and 61772379); US National Science Foundation (NSF) CMMI-1762287 collaborative research “computational framework for designing conformal stretchable electronics”; Ford URP “topology optimization of cellular mesostructures’ nonlinear behaviors for crash safety”; and NSF DMS-1737812 collaborative research “ATD: theory and algorithms for discrete curvatures on network data from human mobility and monitoring”.