Optimizing the loss

As we saw with vanilla GANs, we collect the parameters of the discriminator and the generator as theta_D and theta_G respectively:

# Gather all trainable variables, then split them by the name scopes
# used when building the discriminator ('dis') and generator ('gen')
training_vars = tf.trainable_variables()
theta_D = [var for var in training_vars if 'dis' in var.name]
theta_G = [var for var in training_vars if 'gen' in var.name]
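This filtering works only because of disciplined variable naming: every variable created inside the discriminator carries 'dis' in its name, and every generator variable carries 'gen', so the two lists partition the trainable variables with nothing shared and nothing dropped. A framework-free sketch of the same idea (the variable names below are hypothetical, mimicking TensorFlow's scoped naming):

```python
# Hypothetical scoped variable names, as TensorFlow would produce them.
all_vars = [
    "dis/conv1/kernel", "dis/conv1/bias",
    "gen/dense1/kernel", "gen/dense1/bias",
]

theta_D = [v for v in all_vars if "dis" in v]
theta_G = [v for v in all_vars if "gen" in v]

# The two lists partition the full set: no overlap, nothing left out.
assert set(theta_D) & set(theta_G) == set()
assert set(theta_D) | set(theta_G) == set(all_vars)
```

If a variable were created outside both scopes it would be silently excluded from both updates, so it is worth asserting the partition as above during development.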

We then minimize each loss with the Adam optimizer, restricting each update to its own parameter list via var_list:

d_optimizer = tf.train.AdamOptimizer(learning_rate).minimize(D_loss, var_list=theta_D)
g_optimizer = tf.train.AdamOptimizer(learning_rate).minimize(G_loss, var_list=theta_G)
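Each minimize call computes gradients of the given loss with respect to its var_list and applies an Adam update to those variables only. The update rule that Adam applies to each parameter can be sketched in NumPy on a toy quadratic loss (the function name, learning rate, and loss here are illustrative, not from the source):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update on parameter theta given its gradient."""
    m = b1 * m + (1 - b1) * grad          # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad ** 2     # second-moment (uncentered variance) estimate
    m_hat = m / (1 - b1 ** t)             # bias correction for the warm-up phase
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy loss L(theta) = theta^2, so grad = 2 * theta.
theta = np.array([1.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.01)

print(float(abs(theta[0])))  # close to 0
```

In the GAN setting, d_optimizer runs this update over theta_D using gradients of D_loss, and g_optimizer runs it over theta_G using gradients of G_loss; during training the two ops are executed alternately so each network improves against the current state of the other.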