Quick-thoughts for sentence embeddings

Quick-thoughts is another interesting algorithm for learning sentence embeddings. In skip-thoughts, we saw how we used the encoder-decoder architecture to learn sentence embeddings. In quick-thoughts, we instead try to learn whether a given sentence is related to a candidate sentence. So, instead of using a decoder, we use a classifier to learn whether the given input sentence is related to the candidate sentence.

Let $s$ be the input sentence and let $S_{cand}$ be the set of candidate sentences, containing both valid and invalid context sentences for the given input sentence $s$. Let $s_{cand}$ be any candidate sentence from the set $S_{cand}$.
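To make this concrete, here is a minimal sketch of how such a candidate set could be constructed from a corpus: the sentences adjacent to the input sentence act as valid contexts, and randomly sampled non-neighboring sentences act as invalid contexts. The function name `build_candidate_set`, the `num_negatives` parameter, and the toy corpus are all illustrative choices, not part of the original algorithm:

```python
import random

def build_candidate_set(corpus, idx, num_negatives=3):
    """Pair the sentence at position `idx` with its true neighbors
    (valid contexts) and randomly sampled non-neighbors (invalid
    contexts), and return the shuffled candidate set."""
    valid = [corpus[i] for i in (idx - 1, idx + 1) if 0 <= i < len(corpus)]
    pool = [sent for i, sent in enumerate(corpus) if abs(i - idx) > 1]
    invalid = random.sample(pool, k=min(num_negatives, len(pool)))
    candidates = valid + invalid
    random.shuffle(candidates)
    return corpus[idx], candidates, valid

corpus = [
    "the sun rises in the east",
    "it sets in the west",
    "dogs are loyal animals",
    "stock prices fell sharply today",
    "the market recovered by evening",
]
s, s_cand_set, valid = build_candidate_set(corpus, idx=0)
print("input sentence:", s)
print("candidate set:", s_cand_set)
```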

We use two encoding functions, $f$ and $g$. The role of these two functions, $f$ and $g$, is to learn the embeddings, that is, to learn the vector representations of a given input sentence $s$ and a candidate sentence $s_{cand}$, respectively.
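In the original quick-thoughts model, $f$ and $g$ are recurrent (GRU) encoders. As a deliberately simplified sketch, the snippet below stands in two randomly initialized word-embedding tables with mean pooling for the two encoders, continuing the toy `corpus` from the previous snippet:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = {w: i for i, w in enumerate(sorted({w for sent in corpus
                                            for w in sent.split()}))}
embed_dim = 8

# Two independent embedding tables stand in for the two encoders;
# in the actual model these would be trainable GRU encoders.
W_f = rng.normal(size=(len(vocab), embed_dim))
W_g = rng.normal(size=(len(vocab), embed_dim))

def encode(sentence, W):
    """Mean of word vectors as a stand-in sentence embedding."""
    ids = [vocab[w] for w in sentence.split()]
    return W[ids].mean(axis=0)

def f(sentence):
    """Embed the input sentence s."""
    return encode(sentence, W_f)

def g(sentence):
    """Embed a candidate sentence s_cand."""
    return encode(sentence, W_g)
```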

Once these two functions generate the embeddings, we use a classifier $c$, which returns the probability of each candidate sentence being related to the given input sentence.

As shown in the following figure, the probability of the second candidate sentence is high, as it is related to the given input sentence $s$:

Thus, the probability that $s_{cand}$ is the correct sentence, that is, that $s_{cand}$ is related to the given input sentence $s$, is computed as:

$$p(s_{cand} \mid s, S_{cand}) = \frac{\exp\big(c(f(s), g(s_{cand}))\big)}{\sum_{s' \in S_{cand}} \exp\big(c(f(s), g(s'))\big)}$$
Here, $c$ is a classifier. In the quick-thoughts paper, $c$ is kept deliberately simple, typically just the inner product of the two embeddings, $c(f(s), g(s_{cand})) = f(s)^{\top} g(s_{cand})$, so that the encoders, rather than the classifier, are forced to learn meaningful sentence representations.
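Continuing the sketch above, the scoring step follows directly from this formula: the inner product plays the role of the classifier $c$, and a softmax normalizes the scores over the candidate set:

```python
def candidate_probabilities(s, candidates):
    """Compute p(s_cand | s, S_cand): inner-product scores
    c(f(s), g(s_cand)), softmax-normalized over the candidate set."""
    u = f(s)
    scores = np.array([u @ g(cand) for cand in candidates])
    exp = np.exp(scores - scores.max())  # subtract max for numerical stability
    return exp / exp.sum()

probs = candidate_probabilities(s, s_cand_set)
for cand, p in zip(s_cand_set, probs):
    print(f"{p:.3f}  {cand}")
```

With untrained (random) embeddings these probabilities are close to uniform; training is what pushes probability mass toward the valid context sentences.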

The goal of our classifier is to identify the valid context sentences related to the given input sentence $s$. So, our training objective is to maximize the probability of the correct context sentence for the given input sentence $s$. If the classifier identifies the context sentences correctly, it means that our encoders have learned better representations of the sentences.
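Maximizing this probability is equivalent to minimizing the negative log-likelihood of the valid context sentences, which the sketch below computes over the toy example; in a real implementation, the gradient of this loss would be backpropagated to update the encoder parameters:

```python
def quick_thoughts_loss(s, candidates, valid_sentences):
    """Negative log-probability assigned to the valid context
    sentences; minimizing this maximizes the probability of picking
    the correct contexts, forcing the encoders to learn useful
    sentence representations."""
    probs = candidate_probabilities(s, candidates)
    return -sum(np.log(probs[candidates.index(v)]) for v in valid_sentences)

print("loss:", quick_thoughts_loss(s, s_cand_set, valid))
```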
