Now that the model binaries are stored in the /runs/ folder, all that remains is to write a RESTful API. For this you can use Flask and call the sentiment_engine() function defined in model_inference.py.
Always make sure you load the checkpoint of the best model and the correct embedding file, which are defined as:
checkpoint_dir = "./runs/1508847544/"
embedding = np.load('fasttext_embedding.npy')
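A minimal sketch of such a Flask service is shown below. Since model_inference.py and sentiment_engine() are specific to this project, a stub stands in for the real model call here; the route name /predict and the JSON field names are illustrative assumptions, not part of the original code.

```python
from flask import Flask, jsonify, request

# Stub standing in for the real model call. In the actual project,
# replace this with:  from model_inference import sentiment_engine
def sentiment_engine(text):
    # Hypothetical return shape for illustration only.
    return {"text": text, "sentiment": "positive"}

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"text": "some review text"}.
    payload = request.get_json(force=True)
    text = payload.get("text", "")
    # Delegate scoring to the inference code and return JSON.
    return jsonify(sentiment_engine(text))

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

You could then query the service with, e.g., `curl -X POST -H "Content-Type: application/json" -d '{"text": "great movie"}' http://localhost:5000/predict`. Loading the checkpoint and embedding once at startup (rather than per request) keeps each call fast.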