Optimizing the loss

Next, we need to optimize the generator and the discriminator separately. First, collect the trainable parameters of the discriminator and the generator as theta_D and theta_G, respectively:

training_vars = tf.trainable_variables()
theta_D = [var for var in training_vars if var.name.startswith('discriminator')]
theta_G = [var for var in training_vars if var.name.startswith('generator')]
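This filtering works only if the discriminator and generator variables were created inside correspondingly named scopes (for example, tf.variable_scope('discriminator')), so every variable name carries that prefix. The selection itself is plain string matching; a minimal pure-Python sketch with hypothetical variable names:

```python
# Stand-in for TF variables; only the .name attribute matters here.
# The names below are hypothetical, mimicking tf.variable_scope prefixes.
class FakeVar:
    def __init__(self, name):
        self.name = name

training_vars = [FakeVar('discriminator/dense/kernel:0'),
                 FakeVar('discriminator/dense/bias:0'),
                 FakeVar('generator/dense/kernel:0')]

# Same list comprehensions as in the text: split by name prefix.
theta_D = [var for var in training_vars if var.name.startswith('discriminator')]
theta_G = [var for var in training_vars if var.name.startswith('generator')]

print([v.name for v in theta_D])  # the two discriminator variables
print([v.name for v in theta_G])  # the single generator variable
```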

Minimize each loss with the Adam optimizer, restricting each optimizer to its own network's variables via var_list:

learning_rate = 0.001

D_optimizer = tf.train.AdamOptimizer(learning_rate,
                                     beta1=0.5).minimize(D_loss, var_list=theta_D)
G_optimizer = tf.train.AdamOptimizer(learning_rate,
                                     beta1=0.5).minimize(G_loss, var_list=theta_G)
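Passing var_list ensures that each optimizer updates only its own network's parameters: a discriminator step leaves theta_G untouched, and vice versa. During training, the two updates are run in alternation. A minimal pure-Python sketch of this alternating, list-restricted update, using plain gradient descent in place of Adam and hypothetical quadratic losses in place of D_loss and G_loss:

```python
# Toy stand-ins: each "network" is a single scalar parameter, and the
# hypothetical losses (x - 3)^2 and (y + 2)^2 replace D_loss and G_loss.
theta_D = [0.0]
theta_G = [0.0]
learning_rate = 0.1

def d_grad(x):
    return 2.0 * (x - 3.0)   # d/dx of (x - 3)^2

def g_grad(y):
    return 2.0 * (y + 2.0)   # d/dy of (y + 2)^2

for _ in range(100):
    # Discriminator step: only theta_D changes (mirrors var_list=theta_D).
    theta_D[0] -= learning_rate * d_grad(theta_D[0])
    # Generator step: only theta_G changes (mirrors var_list=theta_G).
    theta_G[0] -= learning_rate * g_grad(theta_G[0])

print(theta_D[0], theta_G[0])  # approaches 3.0 and -2.0
```

In the real GAN, the coupling between the two losses makes the dynamics far less tame than in this toy example, which is exactly why the updates are alternated rather than applied jointly.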