Putting it together

We can then write an inference method to bring together the various modules into a single pipeline for reading inputs and questions, obtaining context vectors, and producing an output:

    def _inference(self, facts, questions):
        with tf.variable_scope("MemoryNetwork"):
            input_vectors = self._input_module(facts)
            question_vectors = self._question_module(questions)
            context_vectors = self._memory_module(question_vectors,
                                                  input_vectors)
            output = self._output_module(context_vectors)
        return output
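The core of the memory module in this pipeline is attention: each fact vector is scored against the question vector, and the scores weight a sum over the facts to produce the context vector. A minimal NumPy sketch of that computation is below; the dot-product scoring, softmax weighting, and shapes are illustrative assumptions, not the book's exact implementation:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def memory_module(question_vec, input_vecs):
    """Attend over encoded facts with an encoded question.

    question_vec: (d,)          encoded question
    input_vecs:   (n_facts, d)  encoded facts
    returns:      (d,)          context vector
    """
    scores = input_vecs @ question_vec   # one score per fact: (n_facts,)
    weights = softmax(scores)            # attention distribution over facts
    return weights @ input_vecs          # weighted sum of fact vectors

rng = np.random.default_rng(0)
facts = rng.normal(size=(5, 8))     # 5 facts, 8-dim embeddings
question = rng.normal(size=8)
context = memory_module(question, facts)
print(context.shape)  # (8,)
```

In the TensorFlow version above, the same pattern would be expressed with `tf.matmul` and `tf.nn.softmax` so that the attention weights are learned end to end with the rest of the network.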

Lastly, we define the fit and predict functions, which can be used to train the memory network and to make predictions with it as part of a larger pipeline. Both use a feed_dict to pass data into the operations defined in the initialization code, which in turn run the _inference function:

    def fit(self, facts, questions, answers):
        feed_dict = {self._facts: facts,
                     self._questions: questions,
                     self._answers: answers}
        loss, _ = self._session.run([self.loss_op, self.train_op],
                                    feed_dict=feed_dict)
        return loss

    def predict(self, facts, questions):
        feed_dict = {self._facts: facts, self._questions: questions}
        return self._session.run(self.predict_op, feed_dict=feed_dict)
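In use, fit and predict slot into an ordinary training loop: call fit once per batch (or per epoch over batches) and predict on held-out data. The sketch below uses a stand-in class with the same interface so it runs without TensorFlow; the batch shapes and the stand-in itself are hypothetical, and in practice you would construct the real MemoryNetwork instead:

```python
import numpy as np

class DummyMemoryNetwork:
    """Stand-in exposing the same fit/predict interface as the
    MemoryNetwork described above, so the loop below is runnable
    without TensorFlow. Replace with the real class in practice."""

    def fit(self, facts, questions, answers):
        # A real model would run loss_op and train_op via session.run;
        # here we just return a placeholder loss value.
        return float(np.mean((answers - answers.mean()) ** 2))

    def predict(self, facts, questions):
        # A real model would run predict_op; here we return zeros.
        return np.zeros(len(questions), dtype=int)

model = DummyMemoryNetwork()
facts = np.zeros((32, 10, 8))       # (batch, n_facts, fact_dim) -- assumed shapes
questions = np.zeros((32, 8))       # (batch, question_dim)
answers = np.zeros(32, dtype=int)   # (batch,) answer token ids

for epoch in range(3):
    loss = model.fit(facts, questions, answers)
    print(f"epoch {epoch}: loss={loss:.4f}")

preds = model.predict(facts, questions)
print(preds.shape)  # (32,)
```

The feed_dict mechanism keeps the training loop decoupled from graph construction: the same placeholders accept any batch of the right shape, so the loop above works unchanged whether the data comes from memory, files, or a queue.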