Model optimizer

Finally, before training our model, we need to implement the optimization criterion for this task. We will reuse the variable-scope naming conventions from earlier to retrieve the trainable parameters of the discriminator and the generator and train the two networks separately:

# specifying the optimization criteria
import tensorflow as tf

def model_optimizer(disc_loss, gen_loss, learning_rate, beta1):
    # retrieve the trainable parameters by the scope names used earlier
    trainable_vars = tf.trainable_variables()
    disc_vars = [var for var in trainable_vars if var.name.startswith('discriminator')]
    gen_vars = [var for var in trainable_vars if var.name.startswith('generator')]

    disc_train_opt = tf.train.AdamOptimizer(
        learning_rate, beta1=beta1).minimize(disc_loss, var_list=disc_vars)

    # ensure the generator's batch-normalization statistics are updated
    # before each generator training step
    update_operations = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
    gen_updates = [opt for opt in update_operations if opt.name.startswith('generator')]

    with tf.control_dependencies(gen_updates):
        gen_train_opt = tf.train.AdamOptimizer(
            learning_rate, beta1=beta1).minimize(gen_loss, var_list=gen_vars)

    return disc_train_opt, gen_train_opt
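
The whole approach hinges on the prefix filtering: because the discriminator and generator layers were built inside tf.variable_scope('discriminator') and tf.variable_scope('generator'), every variable's name begins with that scope. As a minimal sketch (using hypothetical, hand-written name strings in place of real TensorFlow variables), the split works like this:

```python
# Sketch of the name-prefix filtering used in model_optimizer.
# The names below are made-up examples mimicking what
# tf.variable_scope('discriminator') / ('generator') would produce.
trainable_names = [
    'discriminator/dense/kernel:0',
    'discriminator/dense/bias:0',
    'generator/conv2d_transpose/kernel:0',
    'generator/batch_normalization/gamma:0',
]

disc_vars = [n for n in trainable_names if n.startswith('discriminator')]
gen_vars = [n for n in trainable_names if n.startswith('generator')]

print(len(disc_vars), len(gen_vars))  # → 2 2
```

Passing each list as var_list to its own optimizer is what lets a discriminator step leave the generator's weights untouched, and vice versa.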