Cost function

Next, we need to define our performance measure, which is the cross-entropy. The cross-entropy is 0 only when the model assigns a probability of 1 to the correct class, and it grows as the prediction moves away from the true label. Note that this function expects the raw logits of the last fully connected layer (fc_layer_2), because it applies the softmax internally:

cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits=fc_layer_2,
                                                        labels=y_actual)
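
As a quick, self-contained sanity check (the logits and labels below are made-up values, not part of our network), you can verify that the loss is close to 0 when the logits strongly favor the correct class:

import tensorflow as tf

toy_logits = tf.constant([[10.0, 0.0, 0.0]])   # strongly favors class 0
toy_labels = tf.constant([[1.0, 0.0, 0.0]])    # one-hot label for class 0
toy_loss = tf.nn.softmax_cross_entropy_with_logits(logits=toy_logits,
                                                   labels=toy_labels)

with tf.Session() as sess:
    print(sess.run(toy_loss))   # roughly [0.0001], very close to 0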

Next, we need to average all the cross-entropy values that we got from the previous step (one value per image) to get a single scalar cost for the whole batch:

model_cost = tf.reduce_mean(cross_entropy)
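
To see what reduce_mean does here, note that cross_entropy holds one loss value per image; averaging them produces a single scalar. A minimal illustration with hypothetical per-image losses:

per_image_loss = tf.constant([0.12, 0.05, 0.30, 0.01])   # hypothetical values
mean_loss = tf.reduce_mean(per_image_loss)

with tf.Session() as sess:
    print(sess.run(mean_loss))   # approximately 0.12, the average of the four values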

Now, we have a cost function that needs to be optimized/minimized, so we will be using AdamOptimizer, an advanced variant of gradient descent that adapts the learning rate for each parameter:

model_optimizer = tf.train.AdamOptimizer(learning_rate=1e-4).minimize(model_cost)
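
The object returned by minimize() is just a node in the TensorFlow graph; nothing is optimized until we run it inside a session, and each run computes the gradients of the cost and takes one Adam step on the weights. The following self-contained sketch (using a toy variable rather than our network) shows that behavior:

import tensorflow as tf

w = tf.Variable(5.0)
toy_cost = tf.square(w - 2.0)                   # minimum at w = 2
toy_optimizer = tf.train.AdamOptimizer(learning_rate=0.1).minimize(toy_cost)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(200):
        sess.run(toy_optimizer)                 # one Adam update per run
    print(sess.run(w))                          # close to 2.0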