Exploring Few-Shot Learning Algorithms

Congratulations! We have made it to the final chapter. We have come a long way. We started off by learning what neural networks are and how they are used to recognize handwritten digits. Then we explored how to train neural networks with gradient descent algorithms. We also learned how recurrent neural networks are used for sequential tasks and how convolutional neural networks are used for image recognition. Following this, we investigated how the semantics of a text can be understood using word embedding algorithms. Then we got familiar with several different types of generative adversarial networks and autoencoders.

So far, we have learned that deep learning algorithms perform exceptionally well when we have a substantially large dataset. But how can we handle situations in which we don't have a large number of data points to learn from? For many use cases, we simply might not have access to a large dataset. In such cases, we can use few-shot learning algorithms, which do not require huge datasets to learn from. In this chapter, we will understand how exactly few-shot learning algorithms learn from a small number of data points, and we will explore the different types of few-shot learning algorithms. First, we will study a popular few-shot learning algorithm called the siamese network. Following this, we will get an intuitive understanding of other few-shot learning algorithms, such as prototypical, relation, and matching networks.
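All of these algorithms share a common problem setup: in each training episode, the model is given a small support set (N classes with only K labeled examples each) to learn from, and a query set on which it must make predictions. As a rough illustration of this N-way K-shot setup (the function name and parameters below are illustrative, not code from this chapter), a minimal sketch of how one episode can be sampled is shown here:

```python
import numpy as np

def sample_episode(data, labels, n_way=5, k_shot=1, n_query=5):
    """Sample one N-way K-shot episode: a small support set to learn from
    and a query set to evaluate on."""
    # Pick N classes at random for this episode
    classes = np.random.choice(np.unique(labels), size=n_way, replace=False)
    support_x, support_y, query_x, query_y = [], [], [], []
    for new_label, c in enumerate(classes):
        examples = data[labels == c]
        # Sample K support examples and a few query examples for this class
        idx = np.random.choice(len(examples), size=k_shot + n_query, replace=False)
        chosen = examples[idx]
        support_x.append(chosen[:k_shot])
        support_y += [new_label] * k_shot
        query_x.append(chosen[k_shot:])
        query_y += [new_label] * n_query
    return (np.concatenate(support_x), np.array(support_y),
            np.concatenate(query_x), np.array(query_y))
```

For example, a 5-way 1-shot episode contains just five support images (one per class), and the model must classify the query images using only those five labeled examples.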

In this chapter, we will study the following topics:

  • What is few-shot learning?
  • Siamese networks
  • Architecture of siamese networks
  • Prototypical networks
  • Relation networks
  • Matching networks