How to do it...

  1. Evaluate the performance on each image using TensorFlow's cross entropy function. Because this function applies softmax normalization internally, we pass the output of the fully connected layer after dropout (layer_fc2_drop) as the logits, along with the true labels (y_true):
cross_entropy = tf$nn$softmax_cross_entropy_with_logits(logits=layer_fc2_drop, labels=y_true)

The softmax activation is embedded in this cost function, so it does not need to be applied separately to the layer's output; the logits must be the raw, pre-softmax values.

  2. Calculate the average of the cross entropy over the batch; this is the scalar cost that the optimizer will minimize:
cost = tf$reduce_mean(cross_entropy)
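
The two steps above can be combined into a single sketch. This assumes the tensorflow R package is loaded and that the tensors layer_fc2_drop and y_true have been defined in the earlier recipes of this chapter; the optimizer shown at the end (tf$train$AdamOptimizer) is one possible choice, not the only one:

```r
library(tensorflow)

# Per-image cross entropy: softmax is applied internally,
# so layer_fc2_drop must contain raw (pre-softmax) logits.
cross_entropy <- tf$nn$softmax_cross_entropy_with_logits(
  logits = layer_fc2_drop, labels = y_true)

# Scalar cost: mean cross entropy across the batch.
cost <- tf$reduce_mean(cross_entropy)

# The cost can then be handed to an optimizer, for example:
optimizer <- tf$train$AdamOptimizer(learning_rate = 1e-4)$minimize(cost)
```

Averaging with tf$reduce_mean keeps the cost on the same scale regardless of batch size, which makes the learning rate easier to tune.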