On the other hand, the skip-gram model does the inverse of the CBOW task, predicting the context words from the target word. Taking the example we discussed earlier, in the sentence, The cat sat on the dirty mat, skip-gram uses the target word vector for mat to predict the context words the, cat, sat, on, and dirty. Hence, for the target word mat, we generate the training pairs (mat, the), (mat, cat), (mat, sat), (mat, on), (mat, dirty).
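The pair-generation step described above can be sketched as follows. This is a minimal illustration, not an implementation from any particular library; the function name `skipgram_pairs` and the `window` parameter are assumptions chosen for clarity.

```python
def skipgram_pairs(tokens, window=2):
    """Sketch of skip-gram (target, context) pair generation.

    For each position, the word at that position is the target, and
    every word within `window` positions of it is a context word.
    """
    pairs = []
    for i, target in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:  # a word is not its own context
                pairs.append((target, tokens[j]))
    return pairs

sentence = "the cat sat on the dirty mat".split()
# With a window wide enough to span the sentence, the target "mat"
# pairs with every other word; "the" appears twice because it occurs
# twice in the sentence.
for pair in skipgram_pairs(sentence, window=6):
    if pair[0] == "mat":
        print(pair)
```

These (target, context) pairs then become the training examples: the network is fed the one-hot (or embedded) target word and is trained to assign high probability to each paired context word.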