Getting output from TensorFlow

In the previous section, we learned how to build a computational graph; now we need to actually run it and get its output.

We can deploy and run the graph with something called a session, which is simply a binding to a particular execution context, such as a CPU or a GPU. So, we are going to take the graph that we built and deploy it to a CPU or GPU context.

To run the graph, we need to define a session object called sess and call its run function, which takes two arguments:

sess.run(fetches, feeds)

Here:

  • fetches is a list of graph nodes whose outputs we want returned. These are the nodes we are interested in computing the value of.
  • feeds is a dictionary mapping graph nodes to the actual values we want to run our model with. This is where we fill in the placeholders that we talked about earlier.
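To make fetches and feeds concrete, here is a minimal, self-contained sketch. The placeholder x and the node y are hypothetical, and the tf.compat.v1 import is an assumption so that the TF 1.x graph API shown in this section also runs under TensorFlow 2 (on a TF 1 install, plain tf works the same way):

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF 1.x graph API (assumption for TF 2 installs)

tf.disable_eager_execution()  # sessions require graph mode under TF 2

# a hypothetical two-node graph: a placeholder and an op that doubles it
x = tf.placeholder(tf.float32, shape=(None, 3))
y = x * 2.0

with tf.Session() as sess:
    # fetches: the node y; feeds: a dict filling in the placeholder x
    out = sess.run(y, {x: np.ones((2, 3))})

print(out)  # a (2, 3) array of 2.0s
```

Note that only y is fetched; TensorFlow evaluates just the subgraph needed to produce it.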

So, let's go ahead and run our graph:

# import NumPy to generate random values for our placeholder x
import numpy as np
# build a TensorFlow session object, which binds to a default execution
# environment (most likely a CPU)
sess = tf.Session()
# call the run function of the sess object to initialize all the
# variables
sess.run(tf.global_variables_initializer())
# call run on the node we are interested in, h, and feed in a dictionary
# mapping our placeholder x to the values we want to use
sess.run(h, {x: np.random.random((100, 784))})

After running our graph through the sess object, we get back the value of the h node as a NumPy array.

As you can see in the above code snippet, we initialized our variables before evaluating h. This reflects a concept in TensorFlow called lazy evaluation: the evaluation of your graph only ever happens at runtime, and runtime in TensorFlow means the session. So, calling this function, global_variables_initializer(), will actually initialize anything declared as a variable in your graph, such as W and b in our case.
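A short sketch of this lazy behaviour (the variable v here is hypothetical, and tf.compat.v1 is again assumed so the code also runs under TensorFlow 2): fetching a variable before its initializer has run raises a FailedPreconditionError, because the graph merely describes the variable and no value exists until the session executes the initializer op.

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

v = tf.Variable(tf.zeros((2, 2)))  # hypothetical variable, like W or b
sess = tf.Session()

# the graph only *describes* v; nothing has been computed yet, so
# fetching it before initialization fails
try:
    sess.run(v)
except tf.errors.FailedPreconditionError:
    print("v is not initialized yet")

# the initializer is itself a graph op; running it assigns initial values
sess.run(tf.global_variables_initializer())
val = sess.run(v)
print(val)  # the 2x2 zero matrix
sess.close()
```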

We can also use the session in a with block to ensure that it is closed automatically after executing the graph:

ph_var1 = tf.placeholder(tf.float32, shape=(2, 3))
ph_var2 = tf.placeholder(tf.float32, shape=(3, 2))
result = tf.matmul(ph_var1, ph_var2)
with tf.Session() as sess:
    print(sess.run([result], feed_dict={ph_var1: [[1., 3., 4.], [1., 3., 4.]],
                                        ph_var2: [[1., 3.], [3., 1.], [.1, 4.]]}))

Output:

[array([[10.4, 22. ],
       [10.4, 22. ]], dtype=float32)]