Literature on and related to memory networks

For the more curious reader, here is a list of papers introducing new ideas and architectures related to or inspired by memory networks for natural language understanding; a minimal sketch of the attention-over-memory idea they share follows the table:

| Title | Description | URL |
| --- | --- | --- |
| Dynamic Memory Networks (DMNs) and Dynamic Coattention Networks (DCNs) | Introduced by Salesforce Research at around the same time as Facebook's memory networks, DMNs use more sophisticated RNNs to build input representations and to iterate over an episodic memory. DCNs are Salesforce's latest iteration of attention-based reasoning models, built around a novel coattention mechanism. | https://arxiv.org/abs/1506.07285 and https://arxiv.org/abs/1711.00106 |
| Neural Turing Machines (NTMs) and the Differentiable Neural Computer (DNC) | DeepMind's NTMs and DNC set themselves a more ambitious goal: neural networks that can read from and write to external storage, and learn to execute any algorithm a computer can. | https://arxiv.org/abs/1410.5401 and https://www.nature.com/articles/nature20101 |
| Seq2seq Memory Network | Microsoft Research introduced a seq2seq model for dialog generation, augmented with a memory module very similar to that of memory networks. | https://arxiv.org/pdf/1702.01932.pdf |
| Recurrent Entity Networks | Facebook's latest iteration of attention-based models, which builds its memory and reasons over it on the fly as it reads, rather than over a fixed set of explicitly stored memories as in memory networks. | https://arxiv.org/pdf/1612.03969.pdf |
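The common thread running through these architectures is soft attention over a set of stored memory vectors. As a rough illustration (a minimal sketch in the spirit of end-to-end memory networks, not code from any of the papers above; all names and dimensions are made up), the following NumPy snippet shows a single memory "read": the query attends over the memory slots, and the attention-weighted sum is folded back into the query for the next reasoning hop.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Illustrative sizes: 5 memory slots, 8-dimensional embeddings.
n_slots, d = 5, 8
rng = np.random.default_rng(0)

# In a real model these would come from learned sentence and question
# encoders; random vectors stand in for them here.
memory = rng.normal(size=(n_slots, d))  # embedded memory slots m_i
query = rng.normal(size=d)              # embedded question u

# Attention over memory: p_i = softmax(u . m_i).
p = softmax(memory @ query)

# Read vector: attention-weighted sum of the memory slots.
# (End-to-end memory networks use a separate output embedding of the
# memories at this step; a single embedding is shared here for brevity.)
o = p @ memory

# Multi-hop reasoning: the read vector updates the query, u' = u + o,
# and the attend-read-update cycle repeats for the next hop.
query = query + o

print("attention weights:", np.round(p, 3))
```

Each of the papers above varies some part of this loop: how memories are written (NTMs and the DNC), how the query is updated between hops (the DMN's GRU-based episodic memory), or how the memories themselves evolve as the input is read (Recurrent Entity Networks).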
