What is the entropy of a decision-tree dataset with 6 positive and 4 negative examples?
Answer:
The entropy of the given decision-tree dataset with 6 positive and 4 negative examples is approximately 0.9710.
Given :
Number of Positive Examples = 6
Number of Negative Examples = 4
To Find :
Entropy of the Decision tree dataset
Solution :
Total number of examples = 10
Probability of Positive Examples ( P1 )
= Given Number of Positive Examples / Total number of examples
= 6/10
= 0.6
Probability of Negative Examples ( P2 )
= Given Number of Negative Examples / Total number of examples
= 4/10
= 0.4
The entropy of a two-class dataset with class probabilities P1 and P2 is:

Entropy = -P1 * log2(P1) - P2 * log2(P2)

Substituting P1 = 0.6 and P2 = 0.4:

Entropy = -0.6 * log2(0.6) - 0.4 * log2(0.4)
= 0.4422 + 0.5288
= 0.9710

Hence, the entropy of the given decision-tree dataset with 6 positive and 4 negative examples is approximately 0.9710.
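The calculation above can be checked with a short Python sketch. The `entropy` helper below is illustrative (not part of the original answer); it computes the base-2 Shannon entropy from a list of class counts:

```python
import math

def entropy(counts):
    """Base-2 Shannon entropy of a dataset given per-class example counts."""
    total = sum(counts)
    h = 0.0
    for c in counts:
        if c:  # skip empty classes: the 0 * log2(0) term is taken as 0
            p = c / total
            h -= p * math.log2(p)
    return h

# 6 positive and 4 negative examples
print(f"{entropy([6, 4]):.4f}")  # → 0.9710
```

As a sanity check, a perfectly balanced split such as `entropy([5, 5])` gives 1.0, the maximum for two classes.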