In this recipe, we will normalize the outputs of the second fully connected layer using the softmax activation, so that each class receives a probability between 0 and 1 and the values across all 10 classes sum to 1.
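To make the normalization concrete, here is a minimal plain-Python sketch of softmax applied to a hypothetical vector of 10 raw scores (logits), such as the second fully connected layer might produce; the logit values themselves are illustrative, not taken from the recipe:

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw outputs of the second fully connected layer (10 classes)
logits = [2.0, 1.0, 0.1, -1.5, 0.0, 3.2, -0.7, 0.5, 1.8, -2.0]
probs = softmax(logits)

print(all(0.0 <= p <= 1.0 for p in probs))  # each value lies between 0 and 1
print(abs(sum(probs) - 1.0) < 1e-9)         # the 10 values sum to 1
```

Subtracting the maximum logit before exponentiating leaves the result unchanged mathematically but avoids overflow for large scores; deep learning frameworks apply the same trick in their built-in softmax implementations.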