Summary

We made it! From a very noisy dataset, we built two classifiers that solve part of our goal. Of course, we had to be pragmatic and adapt our initial goal to what was achievable. But on the way, we learned about the strengths and weaknesses of nearest-neighbor and logistic regression, and got an introduction to simple classification with neural networks. We learned how to extract features, such as LinkCount, NumTextTokens, NumCodeLines, AvgSentLen, AvgWordLen, NumAllCaps, and NumExclams, and how to analyze their impact on the classifier's performance.
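
To make this concrete, here is a minimal sketch of that feature-extraction-plus-classifier pipeline. The feature computations are simplified stand-ins for illustration (for example, counting `href=` occurrences for LinkCount and four-space-indented lines for NumCodeLines), not the chapter's exact implementations, and the two toy answers and their labels are made up:

```python
import re
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

def extract_features(text):
    """Compute simplified versions of the hand-crafted features for one answer."""
    link_count = len(re.findall(r'href=', text))            # LinkCount
    code_lines = len(re.findall(r'^ {4}', text, re.M))      # NumCodeLines (indented lines)
    words = re.findall(r'\w+', text)
    num_tokens = len(words)                                 # NumTextTokens
    sentences = [s for s in re.split(r'[.!?]+', text) if s.strip()]
    avg_sent_len = num_tokens / max(len(sentences), 1)      # AvgSentLen
    avg_word_len = float(np.mean([len(w) for w in words])) if words else 0.0  # AvgWordLen
    num_all_caps = sum(1 for w in words if w.isupper() and len(w) > 1)        # NumAllCaps
    num_exclams = text.count('!')                           # NumExclams
    return [link_count, num_tokens, code_lines,
            avg_sent_len, avg_word_len, num_all_caps, num_exclams]

# Toy data: two "answers" with made-up labels (1 = good, 0 = poor).
answers = ["See <a href='...'>the docs</a>. The flag is explained in detail.",
           "NO IDEA!!! try google!!!"]
X = np.array([extract_features(a) for a in answers])
y = np.array([1, 0])

# The two model families compared in this chapter.
knn = KNeighborsClassifier(n_neighbors=1).fit(X, y)
logreg = LogisticRegression().fit(X, y)
print(logreg.predict(X))
```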

But what is even more valuable is that we learned an informed way of debugging poorly performing classifiers. That will help us produce usable systems much faster in the future.
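
The core of that debugging approach is comparing training and test error to tell high bias from high variance. The following sketch uses synthetic data from scikit-learn's `make_classification` rather than the chapter's real feature matrix, but the diagnostic logic is the same:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the real feature matrix.
X, y = make_classification(n_samples=1000, n_features=7, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
train_err = 1 - clf.score(X_train, y_train)
test_err = 1 - clf.score(X_test, y_test)

# Both errors high and close together -> high bias: engineer better features
# or use a more expressive model. Low train error but much higher test
# error -> high variance: get more data or simplify the model.
print(f"train error: {train_err:.2f}, test error: {test_err:.2f}")
```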

Having looked into nearest-neighbor and logistic regression, in Chapter 5, Dimensionality Reduction, we will get familiar with yet another simple-but-powerful classification algorithm: Naïve Bayes. Along the way, we will also learn about some more convenient tools from scikit-learn.
