Forward propagation

Now we'll define the forward propagation operation. We'll use ReLU activations in all the hidden layers. In the last layer, we'll apply the sigmoid activation, as shown in the following code:

with tf.name_scope('Model'):

    with tf.name_scope('layer1'):
        layer_1 = tf.nn.relu(tf.add(tf.matmul(X, weights['w1']), biases['b1']))

    with tf.name_scope('layer2'):
        layer_2 = tf.nn.relu(tf.add(tf.matmul(layer_1, weights['w2']), biases['b2']))

    with tf.name_scope('layer3'):
        layer_3 = tf.nn.relu(tf.add(tf.matmul(layer_2, weights['w3']), biases['b3']))

    with tf.name_scope('output_layer'):
        y_hat = tf.nn.sigmoid(tf.add(tf.matmul(layer_3, weights['out']), biases['out']))
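The preceding snippet assumes that the input placeholder X and the weights and biases dictionaries have already been created. As a minimal sketch, they could be defined as follows; the layer sizes used here are illustrative only and are not taken from the original:

import tensorflow as tf

# Hypothetical layer sizes -- adjust these to match your dataset
n_input, n_hidden1, n_hidden2, n_hidden3, n_output = 784, 512, 256, 128, 10

# Placeholder for the input features
X = tf.placeholder(tf.float32, [None, n_input], name='X')

# Weight matrices for the three hidden layers and the output layer
weights = {
    'w1': tf.Variable(tf.random_normal([n_input, n_hidden1]), name='w1'),
    'w2': tf.Variable(tf.random_normal([n_hidden1, n_hidden2]), name='w2'),
    'w3': tf.Variable(tf.random_normal([n_hidden2, n_hidden3]), name='w3'),
    'out': tf.Variable(tf.random_normal([n_hidden3, n_output]), name='w_out'),
}

# Bias vectors, initialized to zero
biases = {
    'b1': tf.Variable(tf.zeros([n_hidden1]), name='b1'),
    'b2': tf.Variable(tf.zeros([n_hidden2]), name='b2'),
    'b3': tf.Variable(tf.zeros([n_hidden3]), name='b3'),
    'out': tf.Variable(tf.zeros([n_output]), name='b_out'),
}

With these definitions in place, y_hat is the network's predicted output, and each name_scope groups the corresponding operations so they appear as collapsible blocks in the TensorBoard graph.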