Summary of the discriminator network

The summary of the discriminator network shows the output shape and number of parameters for each layer:

# Summary of discriminator network model 
summary(d)
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_10 (InputLayer)        [(None, 28, 28, 1)]       0
_________________________________________________________________
conv2d_12 (Conv2D)           (None, 25, 25, 64)        1088
_________________________________________________________________
leaky_re_lu_17 (LeakyReLU)   (None, 25, 25, 64)        0
_________________________________________________________________
flatten_2 (Flatten)          (None, 40000)             0
_________________________________________________________________
dropout_2 (Dropout)          (None, 40000)             0
_________________________________________________________________
dense_7 (Dense)              (None, 1)                 40001
=================================================================
Total params: 41,089
Trainable params: 41,089
Non-trainable params: 0
_________________________________________________________________
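For reference, a discriminator producing exactly this summary could be defined as follows. This is a sketch reconstructed from the summary, not the chapter's original definition: the 4 x 4 kernel size is inferred from the 28 -> 25 shape reduction, and the dropout rate and sigmoid activation are assumptions consistent with the binary_crossentropy loss used later:

# Sketch of a discriminator matching the summary above
di <- layer_input(shape = c(28, 28, 1))
do <- di %>%
  layer_conv_2d(filters = 64, kernel_size = 4) %>%   # 28 -> 25 with default stride 1
  layer_activation_leaky_relu() %>%
  layer_flatten() %>%
  layer_dropout(rate = 0.4) %>%                      # rate is an assumption
  layer_dense(units = 1, activation = "sigmoid")     # activation is an assumption
d <- keras_model(di, do)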

Here, the output of the input layer is 28 x 28 x 1 in size, which matches the dimensions of both the real and fake images. The network has 41,089 parameters in total, all of them trainable.
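The parameter counts can be verified by hand. The convolutional layer has 4 x 4 x 1 weights per filter plus one bias for each of the 64 filters (the 4 x 4 kernel size is inferred from the 28 -> 25 shape reduction), and the dense layer has one weight per flattened input unit plus one bias:

# Parameters of conv2d layer: kernel weights + one bias per filter
conv_params <- 4 * 4 * 1 * 64 + 64    # 1088
# Parameters of dense layer: one weight per input unit + one bias
dense_params <- 25 * 25 * 64 + 1      # 40001
conv_params + dense_params            # 41089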

Now, we can compile the discriminator network model using the following code:

# Compile discriminator network
d %>% compile(optimizer = 'rmsprop',
              loss = "binary_crossentropy")

Here, we have compiled the discriminator network with the rmsprop optimizer and binary_crossentropy loss, which suits the discriminator's binary classification task of separating real images from fake ones.

Next, we freeze the weights of the discriminator network. Note that we freeze these weights after compiling the discriminator network, so the freezing applies only within the gan model: the discriminator still updates its weights when trained directly, but not while the generator is trained through gan:

# Freeze weights and compile
freeze_weights(d)
gani <- layer_input(shape = l)
gano <- gani %>% g %>% d
gan <- keras_model(gani, gano)
gan %>% compile(optimizer = 'rmsprop',
                loss = "binary_crossentropy")

Here, the generative adversarial network's output (gano) is produced by passing the latent input (gani) through the generator network (g) and then through the discriminator network (d) with frozen weights. The generative adversarial network (gan) is then built from gani and gano, and compiled with the rmsprop optimizer and binary_crossentropy loss.

Now, we are ready to train the network.
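A single training iteration typically alternates between the two models, as in the following sketch. The batch size, the x_train name for the real images, and the use of abind to stack batches are assumptions for illustration; l, g, d, and gan are the objects defined above:

# One GAN training step (sketch)
b <- 50                                      # batch size (assumed)
noise <- matrix(rnorm(b * l), nrow = b)      # l is the latent dimension used above
fake <- g %>% predict(noise)                 # generator output: b fake images
real <- x_train[sample(nrow(x_train), b), , , drop = FALSE]
x <- abind::abind(real, fake, along = 1)     # stack real and fake batches
y <- c(rep(1, b), rep(0, b))                 # 1 = real, 0 = fake
d %>% train_on_batch(x, y)                   # update the discriminator
misleading <- rep(1, b)                      # labels claiming the fakes are real
gan %>% train_on_batch(noise, misleading)    # update the generator (d is frozen)

Because the discriminator's weights are frozen inside gan, the second train_on_batch call only adjusts the generator, pushing it to produce images the discriminator classifies as real.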
