How it works...

We added layers to the network by calling the layer() method, as mentioned in step 2. Input layers are added using DenseLayer. We also need to add an activation function for the input layer, which we specify by calling the activation() method. We discussed activation functions in Chapter 1, Introduction to Deep Learning in Java. You can pass any of the activation functions available in DL4J to the activation() method; ReLU is the most commonly used. Here are the roles of the other methods in layer design (a configuration sketch follows the list):

  • nIn(): This refers to the number of inputs for the layer. For an input layer, this is simply the number of input features.
  • nOut(): This refers to the number of outputs passed to the next dense layer in the neural network.
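
Here is a minimal sketch of how these methods fit together in a DL4J layer configuration. The layer sizes (numFeatures, numClasses, and the hidden width of 20) are hypothetical placeholders, not values from the recipe:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LayerConfigSketch {
    public static void main(String[] args) {
        int numFeatures = 10;  // hypothetical number of input features
        int numClasses = 2;    // hypothetical number of output classes

        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                // Input layer: nIn() matches the input feature count,
                // nOut() is the number of outputs fed to the next layer,
                // activation() sets the activation function (ReLU here).
                .layer(new DenseLayer.Builder()
                        .nIn(numFeatures)
                        .nOut(20)
                        .activation(Activation.RELU)
                        .build())
                // Output layer: nIn() must match the previous layer's nOut().
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nIn(20)
                        .nOut(numClasses)
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();

        MultiLayerNetwork network = new MultiLayerNetwork(conf);
        network.init();
        System.out.println(network.summary());
    }
}
```

Note how each layer's nIn() must equal the nOut() of the layer before it; DL4J will otherwise report a shape mismatch at initialization.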