Classifying a data sample with the swimming preference decision tree

Let us use the decision tree constructed for the swimming preference example with the ID3 algorithm. Consider a data sample (good, cold, ?); we would like to use the constructed decision tree to decide which class it belongs to.

We start with the data sample at the root of the tree. The first attribute that branches from the root is swimming suit, so we look up the value of the attribute swimming suit for the sample (good, cold, ?). The value is swimming suit=good, so we move down the rightmost branch, which holds the data samples with that value. We arrive at a node with the attribute water temperature and ask: what is the value of the attribute water temperature for the data sample (good, cold, ?)? We find that water temperature=cold, so we move down the left branch into a leaf node. This leaf is associated with the class swimming preference=no. Therefore, the decision tree classifies the data sample (good, cold, ?) into the class swimming preference=no; that is, it completes it to the data sample (good, cold, no).
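The traversal described above can also be expressed in code. The following is a minimal Python sketch, not the book's implementation: the tree is represented as nested dictionaries, the attribute names (swimming_suit, water_temperature) and the branches other than the one traversed in the text are assumptions made for illustration.

```python
# Decision tree for the swimming preference example, written as nested
# dictionaries: an internal node maps one attribute name to a dictionary of
# {attribute value: subtree}; a leaf node is simply the class label.
# NOTE: only the swimming_suit=good / water_temperature=cold path follows the
# text above; the other branches are simplified placeholders for illustration.
decision_tree = {
    'swimming_suit': {
        'none': 'no',
        'small': 'no',
        'good': {
            'water_temperature': {
                'cold': 'no',
                'warm': 'yes',
            }
        },
    }
}

def classify(tree, sample):
    """Walk the tree from the root, at each node following the branch that
    matches the sample's value for the tested attribute, until a leaf
    (a class label) is reached."""
    node = tree
    while isinstance(node, dict):
        attribute = next(iter(node))   # attribute tested at this node
        value = sample[attribute]      # the sample's value for that attribute
        node = node[attribute][value]  # descend into the matching branch
    return node                        # leaf reached: the predicted class

sample = {'swimming_suit': 'good', 'water_temperature': 'cold'}
print(classify(decision_tree, sample))  # prints: no
```

Running the sketch on the sample (good, cold, ?) reproduces the walk above: the root tests swimming suit, the good branch leads to the water temperature node, and the cold branch ends in the leaf labelled no.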

Therefore, based on the data collected in the table, the decision tree says that if one has a good swimming suit but the water temperature is cold, then one would still not want to swim.
