Defining the generator

As we learned, the generator performs the transpose convolution operation. The generator is composed of transpose convolution and batch norm layers with ReLU activations. We apply batch normalization to every layer except the last. Similarly, we apply ReLU activations to every layer except the last, where we apply the tanh activation function to scale the generated image between -1 and +1:

def generator(z, z_dim, batch_size, is_training=False, reuse=False):
    with tf.variable_scope('generator', reuse=reuse):

First fully connected layer:

        input_to_conv = tf.layers.dense(z, 8*8*128)

Convert the shape of the input and apply batch normalization followed by ReLU activations:

        layer1 = tf.reshape(input_to_conv, (-1, 8, 8, 128))
        layer1 = tf.layers.batch_normalization(layer1, training=is_training, name='batch_normalization1')
        layer1 = tf.nn.relu(layer1, name='relu1')
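The transpose convolution layers that follow pass kernel_initializer=kernel_init. The kernel_init variable is assumed to be defined earlier, before the generator function; if you are running this code on its own, a minimal placeholder (our assumption, not necessarily the initializer chosen earlier in the chapter) could be:

# Assumed weight initializer, defined before the generator function
kernel_init = tf.random_normal_initializer(mean=0.0, stddev=0.02)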

The second layer, that is the transpose convolution layer, with batch normalization and the ReLU activation:

        layer2 = tf.layers.conv2d_transpose(layer1, filters=256, kernel_size=5, strides=2, padding='same',
                                             kernel_initializer=kernel_init, name='deconvolution2')
        layer2 = tf.layers.batch_normalization(layer2, training=is_training, name='batch_normalization2')
        layer2 = tf.nn.relu(layer2, name='relu2')
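With padding='same', the spatial size of a conv2d_transpose output is simply the input size multiplied by the stride, so this stride-2 layer upsamples the 8 x 8 feature map to 16 x 16. A quick sketch of that output-size arithmetic in plain Python:

# With padding='same', conv2d_transpose output size = input size * stride
def deconv_output_size(input_size, stride):
    return input_size * stride

print(deconv_output_size(8, 2))    # 16 -> layer2 upsamples 8x8 to 16x16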

Define the third layer:

        layer3 = tf.layers.conv2d_transpose(layer2, filters=256, kernel_size=5, strides=2, padding='same',
                                             kernel_initializer=kernel_init, name='deconvolution3')
        layer3 = tf.layers.batch_normalization(layer3, training=is_training, name='batch_normalization3')
        layer3 = tf.nn.relu(layer3, name='relu3')

Define the fourth layer:

        layer4 = tf.layers.conv2d_transpose(layer3, filters=256, kernel_size=5, strides=1, padding='same',
                                             kernel_initializer=kernel_init, name='deconvolution4')
        layer4 = tf.layers.batch_normalization(layer4, training=is_training, name='batch_normalization4')
        layer4 = tf.nn.relu(layer4, name='relu4')

In the final layer, we don't apply batch normalization, and instead of ReLU we use the tanh activation:

        layer5 = tf.layers.conv2d_transpose(layer4, filters=3, kernel_size=7, strides=1, padding='same',
                                             kernel_initializer=kernel_init, name='deconvolution5')

        logits = tf.tanh(layer5, name='tanh')

        return logits
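Tracing the shapes: the noise vector is projected to 8 x 8 x 128, the two stride-2 transpose convolutions upsample it to 16 x 16 and then 32 x 32, and the two stride-1 layers keep it at 32 x 32, so the generator outputs 32 x 32 images with 3 channels and values between -1 and +1 (the real images fed to the discriminator should be scaled to the same range). The following is a minimal sketch of how the generator might be wired into the graph, assuming TensorFlow 1.x (which the tf.layers and tf.variable_scope calls imply) and illustrative values for the noise dimension and batch size:

import tensorflow as tf

batch_size = 64     # illustrative value
z_dim = 100         # illustrative noise dimension

# Noise vector fed to the generator
z = tf.placeholder(tf.float32, shape=(None, z_dim), name='z')

# Build the generator; fake_images has shape (batch size, 32, 32, 3)
# with values in [-1, +1] because of the tanh activation
fake_images = generator(z, z_dim, batch_size, is_training=True)

# Reuse the same weights when sampling images after training
sampled_images = generator(z, z_dim, batch_size, is_training=False, reuse=True)

Because batch normalization is used with training=is_training, remember that tf.layers.batch_normalization places its moving-average updates in the tf.GraphKeys.UPDATE_OPS collection, so the generator's training op should be created under tf.control_dependencies over those update ops.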
