Understanding TensorFlow's computation graphs

At its core, TensorFlow relies on building a computation graph, and it uses this graph to derive the relationships between tensors all the way from the input to the output. Let's say we have rank 0 (scalar) tensors a, b, and c, and we want to evaluate z = 2*(a - b) + c. This evaluation can be represented as a computation graph, as shown in the following figure:

[Figure: the computation graph for evaluating z = 2*(a - b) + c]

As we can see, the computation graph is simply a network of nodes. Each node represents an operation, which applies a function to its input tensor or tensors and returns zero or more tensors as the output.

TensorFlow builds this computation graph and uses it to compute gradients of the output with respect to the other tensors in the graph (a brief sketch of this follows the list below). The individual steps for building and compiling such a computation graph in TensorFlow are as follows:

  1. Instantiate a new, empty computation graph.
  2. Add nodes (tensors and operations) to the computation graph.
  3. Execute the graph:
    1. Start a new session
    2. Initialize the variables in the graph
    3. Run the computation graph in this session
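
As referenced above, here is a brief, self-contained sketch of how TensorFlow can derive gradients from a graph. It uses tf.gradients with floating-point constants (the graph name g_grad is our own choice for illustration); the session mechanics used here are covered in detail shortly:

>>> import tensorflow as tf
>>> 
>>> g_grad = tf.Graph()
>>> with g_grad.as_default():
...     # float constants, since gradients are defined for floating-point tensors
...     a = tf.constant(1.0, name='a')
...     b = tf.constant(2.0, name='b')
...     c = tf.constant(3.0, name='c')
...     z = 2*(a - b) + c
...     # dz/da = 2, dz/db = -2, dz/dc = 1
...     grads = tf.gradients(z, [a, b, c])
>>> 
>>> with tf.Session(graph=g_grad) as sess:
...     print(sess.run(grads))
[2.0, -2.0, 1.0]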

So let's create a graph for evaluating z = 2*(a - b) + c, as shown in the previous figure, where a, b, and c are scalars (single numbers). Here, we define them as TensorFlow constants. A graph can be created by calling tf.Graph(); nodes can then be added to it as follows:

>>> import tensorflow as tf
>>> 
>>> g = tf.Graph()
>>> 
>>> with g.as_default():
...     a = tf.constant(1, name='a')
...     b = tf.constant(2, name='b')
...     c = tf.constant(3, name='c')
...
...     z = 2*(a-b) + c

In this code, we added nodes to the g graph using with g.as_default(). If we do not explicitly create a graph, there is always a default graph, and therefore, all the nodes are added to the default graph. In this book, we try to avoid working with the default graph for clarity. This approach is especially useful when we are developing code in a Jupyter notebook, as we avoid piling up unwanted nodes in the default graph by accident.
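
To illustrate the difference, here is a small sketch of our own (not part of the main example): nodes created outside any graph context land in the default graph, while nodes created inside g.as_default() belong to g. Each tensor carries a reference to its graph in its graph attribute:

>>> d = tf.constant(4, name='d')
>>> print(d.graph is tf.get_default_graph())
True
>>> with g.as_default():
...     e = tf.constant(5, name='e')
>>> print(e.graph is g)
True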

A TensorFlow session is an environment in which the operations and tensors of a graph can be executed. A session object is created by calling tf.Session, which can receive an existing graph (here, g) as an argument, as in tf.Session(graph=g); otherwise, it will launch the default graph, which might be empty.

After launching a graph in a TensorFlow session, we can execute its nodes; that is, we can evaluate its tensors or execute its operations. Evaluating an individual tensor involves calling its eval method inside the current session. When evaluating a specific tensor in the graph, TensorFlow has to execute all the preceding nodes in the graph until it reaches that particular one. If the graph contains one or more placeholders, they will need to be fed, as we will see in the next section.
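
For instance, the z tensor from our graph can be evaluated via its eval method. As a minimal sketch, note that calling eval without arguments requires a current default session, which the with block here provides:

>>> with tf.Session(graph=g) as sess:
...     print('z.eval() => ', z.eval())
z.eval() =>  1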

Quite similarly, we can use a session's run method to execute operations that do not return any tensors. An example of such an operation was introduced in Chapter 13, Parallelizing Neural Network Training with TensorFlow, for building multilayer perceptrons for MNIST, namely train_op = optimizer.minimize(loss=cost).
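
Since we have not built a training operation in this section, here is a hedged, minimal sketch of the same idea using a variable and its initializer (the names g2, w, and init_op are hypothetical, chosen for this illustration). The initializer operation is executed purely for its side effect and returns nothing to fetch:

>>> g2 = tf.Graph()
>>> with g2.as_default():
...     w = tf.Variable(0, name='w')                  # a variable node
...     init_op = tf.global_variables_initializer()   # an operation with no output
>>> 
>>> with tf.Session(graph=g2) as sess:
...     sess.run(init_op)   # executes the initializer; returns None
...     print('w => ', sess.run(w))
w =>  0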

Here, we will launch the previous graph in a TensorFlow session and evaluate the tensor z as follows:

>>> with tf.Session(graph=g) as sess:
...     print('2*(a-b)+c => ', sess.run(z))
2*(a-b)+c =>  1

Remember that we define tensors and operations in a computation graph context within TensorFlow. A TensorFlow session is then used to execute the operations in the graph and fetch and evaluate the results.
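
As one more small sketch of fetching results, run also accepts a list of tensors and evaluates them all in a single graph execution:

>>> with tf.Session(graph=g) as sess:
...     a_val, z_val = sess.run([a, z])
...     print('a => ', a_val, ' z => ', z_val)
a =>  1  z =>  1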

In this section, we saw how to define a computation graph, how to add nodes to it, and how to evaluate the tensors in a graph within a TensorFlow session. We'll now take a deeper look into the different types of nodes that can appear in a computation graph, including placeholders and variables. Along the way, we'll see some other operators that do not return a tensor as the output.
