The effects of different pretrained word embeddings

There are a number of pretrained word embeddings available for us to leverage. In effect, these are words and their corresponding n-dimensional word vectors, released by different research teams. Notable pretrained word embeddings include GloVe, Word2vec, and fastText. In our work, we use the pretrained Word2vec word vectors, although any of these embeddings should be useful in building the NER system that we have discussed in this chapter. Note that the reading and processing steps differ from one pretrained model to another, since each is distributed in its own file format.
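As a minimal sketch of how such vectors are typically consumed, the following builds an embedding matrix for a task vocabulary, copying a pretrained vector where one exists and falling back to random initialization for out-of-vocabulary words. The tiny in-memory `pretrained` dictionary stands in for the real Word2vec file, and all names and dimensions here are illustrative assumptions, not part of the chapter's code:

```python
import numpy as np

# Toy stand-in for pretrained Word2vec vectors; in practice these would be
# loaded from a binary file of n-dimensional vectors.
pretrained = {
    "london": np.array([0.1, 0.2, 0.3]),
    "paris": np.array([0.2, 0.1, 0.4]),
}
dim = 3  # illustrative; real Word2vec vectors are typically 300-dimensional

# Task vocabulary, including padding/unknown tokens and an uncovered word.
vocab = ["<pad>", "<unk>", "london", "berlin"]
word2idx = {w: i for i, w in enumerate(vocab)}

# Build the embedding matrix: copy the pretrained vector when the word is
# covered, otherwise initialize randomly (a common fallback for OOV words).
rng = np.random.default_rng(0)
embedding_matrix = np.zeros((len(vocab), dim))
for word, idx in word2idx.items():
    vec = pretrained.get(word)
    embedding_matrix[idx] = vec if vec is not None else rng.normal(scale=0.1, size=dim)
```

The resulting matrix can then be used to initialize the model's word-embedding layer, with rows indexed by `word2idx`.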

TensorBoard graph

The graph in the preceding diagram, from TensorBoard, shows how the word and character embeddings are used as input for the bidirectional LSTM.
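To make the input assembly concrete, the sketch below concatenates each token's word vector with a character-level feature vector, producing the per-token inputs that would be fed to the bidirectional LSTM. For brevity, a simple mean over character embeddings stands in for the char-level encoder, and all lookup tables and dimensions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
word_dim, char_dim = 4, 2  # illustrative sizes

# Illustrative lookup tables; in the real model the word table comes from
# pretrained Word2vec and the char table is learned during training.
word_emb = {"john": rng.normal(size=word_dim), "smith": rng.normal(size=word_dim)}
char_emb = {c: rng.normal(size=char_dim) for c in "abcdefghijklmnopqrstuvwxyz"}

def token_input(token):
    # Character features: a mean over character embeddings stands in here
    # for the character-level encoder's output.
    char_feat = np.mean([char_emb[c] for c in token], axis=0)
    # Concatenate word-level and character-level representations.
    return np.concatenate([word_emb[token], char_feat])

sentence = ["john", "smith"]
# Shape (sequence_length, word_dim + char_dim): the bidirectional LSTM
# would consume one such row per token.
inputs = np.stack([token_input(t) for t in sentence])
```

This mirrors the structure in the graph: the two embedding streams are merged per token before entering the recurrent layer.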
