How it works...

In step 1, we specify the default seed value, the initial default weights (weightInit), the weight updater, and so on. We set the gradient normalization strategy to ClipElementWiseAbsoluteValue, and we set the gradient normalization threshold to 0.5.
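As a sketch, the builder calls for step 1 might look like the following. The concrete seed value, weight initialization, and updater shown here are assumptions for illustration; substitute the values used in your own recipe:

```java
import org.deeplearning4j.nn.conf.GradientNormalization;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.learning.config.Adam;

// Hypothetical values chosen for illustration only
NeuralNetConfiguration.Builder neuralNetConfigBuilder = new NeuralNetConfiguration.Builder()
        .seed(12345)                      // default seed for reproducibility (example value)
        .weightInit(WeightInit.XAVIER)    // initial default weights (example choice)
        .updater(new Adam())              // weight updater (example choice)
        .gradientNormalization(GradientNormalization.ClipElementWiseAbsoluteValue)
        .gradientNormalizationThreshold(0.5);
```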

The neural network calculates gradients across the neurons at each layer. Just as we normalized the input data earlier, in the Normalizing training data recipe, it also makes sense to normalize the gradient values to keep training stable. As we can see in step 1, we have used ClipElementWiseAbsoluteValue gradient normalization. It works in such a way that the absolute value of a gradient cannot exceed the threshold. For example, if the gradient threshold value is 3, then the allowed value range is [-3, 3]. Any gradient value less than -3 is treated as -3, and any gradient value greater than 3 is treated as 3. Gradient values already in the range [-3, 3] are left unmodified. We specify the gradient normalization strategy as well as the threshold in the network configuration, as shown here:

neuralNetConfigBuilder.gradientNormalization(GradientNormalization.ClipElementWiseAbsoluteValue);
neuralNetConfigBuilder.gradientNormalizationThreshold(thresholdValue);
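The element-wise clipping behavior described above can be sketched in plain Java. This is a minimal illustration of the clipping rule itself, not DL4J's internal implementation:

```java
public class ClipDemo {

    // Element-wise absolute-value clipping: any value outside
    // [-threshold, threshold] is pinned to the nearest boundary.
    static double clip(double gradient, double threshold) {
        return Math.max(-threshold, Math.min(threshold, gradient));
    }

    public static void main(String[] args) {
        double threshold = 3.0;
        System.out.println(clip(-5.0, threshold)); // below -3, clipped to -3.0
        System.out.println(clip(4.2, threshold));  // above 3, clipped to 3.0
        System.out.println(clip(1.5, threshold));  // inside the range, unchanged
    }
}
```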

In step 3, the trainFeatures label refers to the input layer label. The inputs are essentially the graph vertex objects returned by the graphBuilder() method. The LSTM layer name specified in step 2 (L1 in our example) will be used while configuring the output layer. If there's a mismatch, our program will throw an error during execution, saying that the layers are configured in such a way that they are disconnected. We will discuss this in more depth in the next recipe, when we design output layers for the neural network. Note that we have yet to add output layers to the configuration.
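Steps 2 and 3 can be sketched as the following graph configuration. The layer sizes here are placeholders (assumptions for illustration); only the vertex names trainFeatures and L1 come from the recipe:

```java
import org.deeplearning4j.nn.conf.ComputationGraphConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;

ComputationGraphConfiguration.GraphBuilder builder = new NeuralNetConfiguration.Builder()
        .graphBuilder()
        .addInputs("trainFeatures")         // graph vertex representing the input layer
        .addLayer("L1", new LSTM.Builder()
                .nIn(86)                    // placeholder input size
                .nOut(200)                  // placeholder LSTM layer size
                .build(), "trainFeatures"); // L1 is fed by the trainFeatures vertex
// The output layer, added in the next recipe, must name "L1" as its input.
// Any mismatch in these vertex names makes DL4J report the layers as disconnected.
```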
