Importance Sampling for Generative Adversarial Networks
Generative adversarial networks (GANs) are a powerful tool for data generation. The model trains two networks, a generator and a discriminator, and at each iteration reduces the Jensen-Shannon divergence between the distribution of the generated data and the distribution of the real data. This leads to strongly overlapping distributions, which slows the model's learning: many samples in the intersection have already reached the correct solution, yet the discriminator still receives a large loss value and keeps updating its parameters. To solve this problem, a novel technique is introduced. This work demonstrates that combining samples into pairs and applying importance sampling, which changes the sampling distribution so as to decrease the variance of the gradient estimate, provides an efficient solution. The result is confirmed on a range of cases, from a simple 1D GAN on two Gaussian distributions to image classification on the MNIST dataset.
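The variance-reduction mechanism behind importance sampling can be illustrated with a generic, self-contained sketch; this is not the paper's pairing scheme for GAN gradients, only the underlying estimator idea. We estimate the rare-event probability P(X > 3) for X ~ N(0, 1): the naive Monte Carlo estimator wastes almost all samples, while drawing from a shifted proposal N(4, 1) and reweighting by the density ratio p/q gives an unbiased estimator with far lower per-sample variance. The proposal mean of 4 and the sample size are illustrative choices.

```python
import math

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Target: P(X > 3) under the standard normal p = N(0, 1).
true_p = 0.5 * math.erfc(3 / math.sqrt(2))  # ~= 0.00135

# Naive Monte Carlo: sample from p directly; almost no samples land above 3.
x = rng.normal(0.0, 1.0, n)
naive_est = (x > 3).mean()

# Importance sampling: sample from the proposal q = N(4, 1), which puts most
# of its mass in the region of interest, and correct with weights w = p/q.
# For these two Gaussians the density ratio simplifies to exp(8 - 4y).
y = rng.normal(4.0, 1.0, n)
w = np.exp(8.0 - 4.0 * y)
is_terms = (y > 3) * w          # per-sample unbiased estimates of P(X > 3)
is_est = is_terms.mean()

# The reweighted estimator has much smaller per-sample variance, so the same
# budget of n samples yields a far more accurate estimate.
print(f"true      : {true_p:.6f}")
print(f"naive MC  : {naive_est:.6f}  (per-sample var {np.var(x > 3):.2e})")
print(f"importance: {is_est:.6f}  (per-sample var {np.var(is_terms):.2e})")
```

The same principle applies to gradient estimation: replacing the sampling distribution with one concentrated where the integrand is informative, then reweighting, keeps the estimator unbiased while shrinking its variance.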