Introducing TensorBoard

TensorBoard is TensorFlow's visualization tool, which can be used to visualize the computational graph. It can also plot various quantitative metrics and the results of intermediate calculations. When we train a very deep neural network, debugging it can become confusing. If we can visualize the computational graph in TensorBoard, we can easily understand, debug, and optimize such complex models. TensorBoard also supports sharing these visualizations with others.

As shown in the following screenshot, the TensorBoard panel consists of several tabs—SCALARS, IMAGES, AUDIO, GRAPHS, DISTRIBUTIONS, HISTOGRAMS, and EMBEDDINGS:

The tabs are pretty self-explanatory. The SCALARS tab shows useful information about the scalar variables we use in our program. For example, it shows how the value of a scalar variable called loss changes over several iterations.

The GRAPHS tab shows the computational graph. The DISTRIBUTIONS and HISTOGRAMS tabs show the distribution of a variable. For example, our model's weight distribution and histogram can be seen under these tabs. The EMBEDDINGS tab is used for visualizing high-dimensional vectors, such as word embeddings (we will learn about this in detail in Chapter 7, Learning Text Representations).

Let's build a basic computational graph and visualize it in TensorBoard. Let's say we have four variables, shown as follows:

import tensorflow as tf

x = tf.constant(1, name='x')
y = tf.constant(1, name='y')
a = tf.constant(3, name='a')
b = tf.constant(3, name='b')

Let's multiply x with y, and a with b, and save the results as prod1 and prod2, as shown in the following code:

prod1 = tf.multiply(x, y, name='prod1')
prod2 = tf.multiply(a, b, name='prod2')

Add prod1 and prod2 and store them in sum:

sum = tf.add(prod1, prod2, name='sum')

Now, we can visualize all of these operations in TensorBoard. In order to visualize them in TensorBoard, we first need to save our event file. This can be done using tf.summary.FileWriter(). It takes two important parameters, logdir and graph.

As the name suggests, logdir specifies the directory where we want to store the graph, and graph specifies which graph we want to store:

with tf.Session() as sess:
    writer = tf.summary.FileWriter(logdir='./graphs', graph=sess.graph)
    print(sess.run(sum))

In the preceding code, ./graphs is the directory where we are storing our event file, and sess.graph specifies the current graph in our TensorFlow session. So, we are storing the current session's graph in the graphs directory.

To start TensorBoard, go to your Terminal, navigate to the working directory, and type the following:

tensorboard --logdir=graphs --port=8000

The logdir parameter indicates the directory where the event file is stored, and port is the port number. Once you run the preceding command, open your browser and go to http://localhost:8000/.

In the TensorBoard panel, under the GRAPHS tab, you can see the computational graph:

As you may notice, all of the operations we have defined are clearly shown in the graph.
