Build a decision tree using the ID3 algorithm for recommending novel types. Show the entropy and gain calculations clearly. What would you recommend to a male who is 30 years old?
Answers
Explanation:
Entropy: ID3 first computes the entropy of the target class from its frequency table, and then the entropy of each candidate split using the frequency table of two attributes (the candidate attribute against the class).
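As a reference for the calculations, these are the standard ID3 formulas (S is the data set, A a candidate attribute, and S_v the subset of S where A takes value v):

Entropy(S)    = -\sum_i p_i \log_2 p_i
Entropy(S, A) = \sum_{v \in Values(A)} \frac{|S_v|}{|S|} \, Entropy(S_v)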
Information Gain: The information gain is the decrease in entropy after the data set is split on an attribute, Gain(S, A) = Entropy(S) - Entropy(S, A). Constructing the decision tree comes down to choosing, at each node, the attribute that returns the highest information gain, i.e., the one that produces the most homogeneous branches. A small code sketch of these two calculations follows.
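Here is a minimal Python sketch of the entropy and gain calculations. Since the original question's training table is not reproduced here, the rows below are made-up placeholder data (hypothetical gender, age and novel_type values); with the real table, the same functions give the numbers needed to pick the root attribute and answer the question about a 30-year-old male.

```
from collections import Counter
from math import log2

def entropy(labels):
    """Entropy(S) = -sum(p_i * log2(p_i)) over the class labels."""
    total = len(labels)
    return -sum((count / total) * log2(count / total)
                for count in Counter(labels).values())

def information_gain(rows, attribute, target):
    """Gain(S, A) = Entropy(S) - sum(|S_v|/|S| * Entropy(S_v))."""
    total = len(rows)
    base = entropy([row[target] for row in rows])
    remainder = 0.0
    for value in {row[attribute] for row in rows}:
        subset = [row[target] for row in rows if row[attribute] == value]
        remainder += (len(subset) / total) * entropy(subset)
    return base - remainder

# Placeholder rows standing in for the question's table (assumed values).
data = [
    {"gender": "male",   "age": "<=30", "novel_type": "thriller"},
    {"gender": "male",   "age": ">30",  "novel_type": "history"},
    {"gender": "female", "age": "<=30", "novel_type": "romance"},
    {"gender": "female", "age": ">30",  "novel_type": "romance"},
]

# The attribute with the highest gain becomes the root of the tree.
for attr in ("gender", "age"):
    print(attr, round(information_gain(data, attr, "novel_type"), 3))
```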