Putting it all together

To run everything we have just coded, we define our model hyperparameters and instantiate our chatbot model. We then train the model for 100 epochs, evaluating its performance on the validation set every 10 epochs. After training, we can test the model on the test data as follows:

chatbot = ChatBotWrapper(train_data, test_data, val_data,
                         candidates, candidates_to_idx,
                         memory_size=50,
                         batch_size=32,
                         learning_rate=0.001,
                         evaluation_interval=10,
                         hops=3,
                         epochs=100,
                         embedding_size=50)
chatbot.train()
chatbot.test()

The following is the output:

Epoch: 10
Total Cost: 17703.9733608
Training Accuracy: 0.756870229008
Validation Accuracy: 0.729912770223
------------------------------------------------
Epoch: 20
Total Cost: 7439.67566451
Training Accuracy: 0.903217011996
Validation Accuracy: 0.857127377147
------------------------------------------------
Epoch: 30
Total Cost: 3179.78263753
Training Accuracy: 0.982769901854
Validation Accuracy: 0.939372595763
.
.
.
------------------------------------------------
Epoch: 80
Total Cost: 1949.99280906
Training Accuracy: 0.980861504907
Validation Accuracy: 0.937747196186
------------------------------------------------
Epoch: 90
Total Cost: 500.894205613
Training Accuracy: 0.995637949836
Validation Accuracy: 0.95400119196
------------------------------------------------
Epoch: 100
Total Cost: 912.067172846
Training Accuracy: 0.995092693566
Validation Accuracy: 0.954813891748
------------------------------------------------
Testing Accuracy: 0.958093271008

As we train our chatbot, we evaluate its performance on the validation data, and we should see the loss decrease and the accuracy increase. By the end of training, we end up with a model that performs reasonably well on the test set, although it could do even better if we employed techniques such as gradient clipping, or regularization schemes such as L2 weight regularization or dropout. Adding them is fairly straightforward in TensorFlow, and is left as an exercise for the reader.
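To make the two main suggestions concrete, the following sketch illustrates, in plain NumPy, what global-norm gradient clipping (the computation behind TensorFlow's tf.clip_by_global_norm) and an L2 weight penalty actually do. The helper names here are our own, for illustration only; in the real model you would apply the TensorFlow equivalents to the optimizer's gradients and to the loss.

```python
import numpy as np

def clip_by_global_norm(grads, clip_norm):
    """Scale a list of gradient arrays so their combined (global) L2 norm
    does not exceed clip_norm; gradients below the threshold pass through
    unchanged. This mirrors tf.clip_by_global_norm's computation."""
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    scale = clip_norm / max(global_norm, clip_norm)
    return [g * scale for g in grads], global_norm

def l2_penalty(weights, lam):
    """L2 regularization term to be added to the loss: lam * sum ||W||^2."""
    return lam * sum(np.sum(w ** 2) for w in weights)

# Two gradient tensors whose global norm is sqrt(3^2 + 4^2) = 5.0
grads = [np.array([3.0, 0.0]), np.array([0.0, 4.0])]
clipped, norm = clip_by_global_norm(grads, 1.0)
print(norm)                                               # 5.0
print(np.sqrt(sum(np.sum(g ** 2) for g in clipped)))      # 1.0
```

In the chatbot model, clipping would be applied between computing the gradients and applying them with the optimizer, while the L2 term would simply be added to the cross-entropy loss before minimization.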
