Pros and cons

The examples selected in this chapter do not do justice to the versatility and accuracy of the Naïve Bayes family of classifiers.

Naïve Bayes classifiers are simple and robust generative classifiers that rely on class priors and conditional probabilities of features to extract a model from a training dataset. Naïve Bayes has the following benefits:

  • Simple implementation and easy to parallelize
  • Very low computational complexity: O((n+c)*m), where m is the number of features, c is the number of classes, and n is the number of observations
  • Handles missing data
  • Supports incremental updates, insertions, and deletions (see the sketch after this list)
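To make the complexity and incremental-update points concrete, the following sketch trains a multinomial Naïve Bayes model purely from counts. It is a minimal illustration under assumed names and encodings (integer labels and feature indices), not the chapter's implementation. Inserting one observation touches only its own m features, which is what keeps training roughly linear in the number of observations and makes incremental insertion cheap.

    // Minimal sketch (not the chapter's code): incremental, count-based training
    // for a multinomial Naive Bayes model. Labels and features are integer indices.
    final case class NaiveBayesCounts(
        classCounts: Map[Int, Long],             // label -> number of observations
        featureCounts: Map[Int, Map[Int, Long]]  // label -> (feature index -> count)
    ) {
      // One insertion updates only the features of that observation: O(m) per update,
      // so full training is O(n*m) and new observations can be folded in at any time.
      def insert(label: Int, features: Seq[Int]): NaiveBayesCounts = {
        val perClass = featureCounts.getOrElse(label, Map.empty[Int, Long])
        val updated = features.foldLeft(perClass) { (acc, f) =>
          acc.updated(f, acc.getOrElse(f, 0L) + 1L)
        }
        NaiveBayesCounts(
          classCounts.updated(label, classCounts.getOrElse(label, 0L) + 1L),
          featureCounts.updated(label, updated)
        )
      }
    }

    object NaiveBayesCounts {
      val empty: NaiveBayesCounts = NaiveBayesCounts(Map.empty, Map.empty)
    }

    // Usage: train on two observations, then add a third later without retraining.
    val model = NaiveBayesCounts.empty
      .insert(0, Seq(1, 3, 3))
      .insert(1, Seq(2, 4))
    val updatedModel = model.insert(0, Seq(3))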

However, Naïve Bayes is not a silver bullet. It has the following disadvantages:

  • The assumption that features are conditionally independent rarely holds in real-world data
  • It requires a large training set to achieve reasonable accuracy
  • It suffers from the zero-frequency problem: a feature value that never co-occurs with a class in the training set yields a zero conditional probability, which wipes out the entire product of likelihoods for that class (see the smoothing sketch after this list)
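The zero-frequency problem is commonly mitigated with Laplace (additive) smoothing. The helper below is a small sketch under the same assumptions as the previous example, not the chapter's implementation; the parameter names are illustrative.

    // Sketch of Laplace (additive) smoothing: a feature never seen with a class no
    // longer produces a zero likelihood that would zero out the whole product.
    def smoothedLikelihood(
        featureCount: Long,    // count of feature f observed with class c
        classTotal: Long,      // total feature count observed with class c
        numFeatures: Int,      // number of distinct features (vocabulary size)
        alpha: Double = 1.0    // alpha = 1.0 is classic add-one smoothing
    ): Double =
      (featureCount + alpha) / (classTotal + alpha * numFeatures)

    // An unseen feature (count 0) still receives a small, non-zero probability.
    val p = smoothedLikelihood(featureCount = 0L, classTotal = 250L, numFeatures = 40)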