Information gain

Information gain is the reduction in information entropy achieved as a result of a certain procedure. For example, if we would like to know the results of three fair coin tosses, the information entropy of that outcome is 3 bits. But if we could look at the third coin, then the information entropy of the result for the remaining two coins would be 2 bits. Thus, by looking at the third coin, we gained one bit of information, so the information gain is 1 bit.
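As a quick illustration, the following Python sketch (the function name entropy and the variable names are my own, not from the text) computes the entropy of the coin example above directly from the outcome probabilities.

import math

def entropy(probabilities):
    # Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Three independent fair coins have 2**3 = 8 equally likely outcomes.
three_coins = [1 / 8] * 8
print(entropy(three_coins))   # 3.0 bits

# After observing the third coin, 2**2 = 4 outcomes remain for the other two.
two_coins = [1 / 4] * 4
print(entropy(two_coins))     # 2.0 bits

# Information gain from looking at the third coin: 3.0 - 2.0 = 1.0 bit
print(entropy(three_coins) - entropy(two_coins))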

We can also gain information by dividing the whole set S into subsets, grouping the elements by a common pattern. If we group the elements by their value of an attribute A, then we define the information gain as:

$$IG(S, A) = E(S) - \sum_{v \in \mathrm{Values}(A)} \frac{|S_v|}{|S|}\, E(S_v)$$

where $S_v$ is the subset of S containing the elements that have the value v for the attribute A, and E denotes the information entropy.
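To make the definition concrete, here is a minimal Python sketch of this computation; the toy dataset, the attribute names, and the function information_gain are illustrative assumptions rather than anything given in the text.

import math
from collections import Counter

def entropy(labels):
    # Shannon entropy (in bits) of the class labels in a set
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(examples, attribute, label_key="label"):
    # examples: list of dicts, each holding attribute values and a class label
    labels = [ex[label_key] for ex in examples]
    base_entropy = entropy(labels)

    # Group the examples into subsets S_v by their value v of the attribute A
    subsets = {}
    for ex in examples:
        subsets.setdefault(ex[attribute], []).append(ex[label_key])

    # IG(S, A) = E(S) - sum over v of (|S_v| / |S|) * E(S_v)
    weighted = sum(len(sv) / len(examples) * entropy(sv)
                   for sv in subsets.values())
    return base_entropy - weighted

# Hypothetical toy data: splitting on "windy" removes all uncertainty about "play".
data = [
    {"windy": "yes", "play": "no"},
    {"windy": "yes", "play": "no"},
    {"windy": "no",  "play": "yes"},
    {"windy": "no",  "play": "yes"},
]
print(information_gain(data, "windy", label_key="play"))  # 1.0 bit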
