Summary

In this chapter, we introduced the essentials of machine learning. We started with some simple yet effective algorithms (linear and logistic regression, Naive Bayes, and K-Nearest Neighbors), then moved on to a more advanced one (SVM). We explained how to combine weak learners into stronger classifiers (ensembles, Random Forests, Gradient Tree Boosting) and touched on three powerful gradient-boosting libraries: XGBoost, LightGBM, and CatBoost. Finally, we took a quick look at the algorithms used in big data, clustering, and NLP.

In the next chapter, we are going to introduce you to the basics of visualization with Matplotlib, show how to perform EDA with pandas and create beautiful visualizations with Seaborn, and explain how to set up a web server to provide information on demand.
