Simple implementation of GANs

From the story of faking a ticket to an event, the idea behind GANs seems very intuitive. To get a clear understanding of how GANs work and how to implement them, we are going to walk through a simple implementation of a GAN on the MNIST dataset.

First, we need to build the core of the GAN, which comprises two major components: the generator and the discriminator. As we said, the generator tries to imagine, or fake, data samples drawn from a specific probability distribution; the discriminator, which has access to the actual data samples, judges whether the generator's output is flawed or very close to the original data samples. As in the event scenario, the whole purpose of the generator is to convince the discriminator that the generated image comes from the real dataset, and hence to fool it.

The training process ends much like the event story: the generator eventually manages to produce images that look very similar to the original data samples:

Figure 2: GAN general architecture for the MNIST dataset

Figure 2 shows the typical structure of a GAN, which we will train on the MNIST dataset. The Latent sample part of this figure is a random vector, a kind of random thought, that the generator uses as a seed for producing fake images that mimic the real ones.
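For instance, a latent sample can simply be a batch of noise vectors drawn from a uniform distribution. The batch size and latent size below are illustrative assumptions, not values fixed by the architecture:

# Minimal sketch of drawing latent samples; the sizes here are assumptions
batch_size = 16
latent_size = 100

# Each row is one random "thought" the generator will turn into a fake image
latent_sample = np.random.uniform(-1, 1, size=(batch_size, latent_size))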

As we mentioned, the discriminator acts as a judge and tries to separate the real images from the fake ones produced by the generator. The output of this network is therefore binary and can be modeled with a sigmoid function, where values near 0 mean the input is a fake image and values near 1 mean the input is a real image.
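As a rough illustration only (the layer sizes, activation, and scope name here are assumptions, not the final implementation we will build later), a discriminator along these lines could be sketched as a hidden layer followed by a single sigmoid output unit:

# Illustrative sketch of a discriminator; sizes and names are assumptions
def discriminator_sketch(images, hidden_units=128, reuse=False):
    with tf.variable_scope('discriminator', reuse=reuse):
        hidden = tf.layers.dense(images, hidden_units, activation=tf.nn.leaky_relu)
        logits = tf.layers.dense(hidden, 1)  # one raw score per input image
        output = tf.sigmoid(logits)          # ~0 => fake image, ~1 => real image
    return output, logits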

Let's go ahead and start implementing this architecture to see how it performs on the MNIST dataset.

Let's start off by importing the required libraries for this implementation:

%matplotlib inline

import matplotlib.pyplot as plt
import pickle as pkl

import numpy as np
import tensorflow as tf

We will be using the MNIST dataset, so we are going to use the TensorFlow helpers to download the dataset and store it locally:

from tensorflow.examples.tutorials.mnist import input_data
mnist_dataset = input_data.read_data_sets('MNIST_data')
Output:
Extracting MNIST_data/train-images-idx3-ubyte.gz
Extracting MNIST_data/train-labels-idx1-ubyte.gz
Extracting MNIST_data/t10k-images-idx3-ubyte.gz
Extracting MNIST_data/t10k-labels-idx1-ubyte.gz
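Once the dataset has been downloaded, we can pull mini-batches of training images from it; with the default settings, each image comes as a flattened vector of 784 pixel values scaled to the [0, 1] range. A quick check (the batch size of 64 is just an example):

# Grab a mini-batch of training images to inspect their shape
batch_images = mnist_dataset.train.next_batch(64)[0]
print(batch_images.shape)  # (64, 784) -- 64 flattened 28x28 images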