Pretrained models

One of the fundamental requirements for transfer learning is the availability of models that perform well on source tasks. Luckily, the deep learning world believes in sharing. Many of the state-of-the-art deep learning architectures have been openly shared by their respective teams. These span different domains, such as computer vision and NLP. We looked at some of the most well-known and well-documented architectures in Chapter 3, Understanding Deep Learning Architectures. The teams behind those networks have shared not just their results, but also their pretrained models. A pretrained model is usually shared in the form of the millions of parameters/weights the model learned while being trained to a stable state. Pretrained models are available for everyone to use through different means. The famous deep learning Python library, Keras, provides an interface to download various pretrained networks, such as Xception, VGG16, and InceptionV3, as shown in the sketch below. Along the same lines, pretrained models are also available through TensorFlow and other deep learning libraries. An even more extensive collection of pretrained models, developed over the years, is available through Berkeley's Model Zoo (http://caffe.berkeleyvision.org/model_zoo.html).
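As a minimal sketch of what this looks like in practice, the following snippet loads VGG16 with its ImageNet-trained weights through the Keras applications module (here assumed to be the one bundled with TensorFlow 2.x; the first call downloads and caches the weights):

```python
from tensorflow.keras.applications import VGG16

# Download VGG16 with its ImageNet-trained weights.
# include_top=False drops the final classification layers so the
# convolutional base can be reused on a new target task.
model = VGG16(weights='imagenet', include_top=False)

# Inspect the layers and parameter counts of the pretrained network.
model.summary()
```

Other architectures mentioned above, such as Xception and InceptionV3, can be loaded the same way by swapping the class imported from the applications module.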
