Computer Science, asked by visiravi20, 3 months ago

What is the entropy for a decision tree data-set with 6 positive and 4 negative examples?

Answers

Answered by gulabshani12

Answer:

The entropy for a decision tree data-set with 6 positive and 4 negative examples is approximately 0.971.

Answered by ChitranjanMahajan

The entropy of the given decision tree dataset with 6 positive and 4 negative samples is approximately 0.971.

Given :

Number of Positive Examples = 6

Number of Negative Examples = 4

To Find :

Entropy of the Decision tree dataset

Solution :

Total number of examples = 6 + 4 = 10

Probability of Positive Examples (P1)

         = Given Number of Positive Examples / Total number of examples

         = 6/10

         = 0.6

Probability of Negative Examples (P2)

         = Given Number of Negative Examples / Total number of examples

         = 4/10

         = 0.4

The formula for the entropy of a dataset with n1 examples in one category and n2 examples in the other is:

                 E = -[ P1*log₂(P1) + P2*log₂(P2) ]

Where,     P1 = n1 / (n1 + n2)

                P2 = n2 / (n1 + n2)

Here,  E = -[ (0.6*log₂(0.6)) + (0.4*log₂(0.4)) ]

              = -[ (0.6*(-0.737)) + (0.4*(-1.322)) ]

              = -[ (-0.4422) + (-0.5288) ]

              = 0.4422 + 0.5288

              = 0.9710

Hence, the entropy of the given decision tree dataset with 6 positive and 4 negative examples is approximately 0.971.
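As a quick cross-check, the same calculation can be written as a short Python sketch (the function name entropy_two_class is my own, not part of the question):

    import math

    def entropy_two_class(n_pos, n_neg):
        # E = -[ P1*log₂(P1) + P2*log₂(P2) ], with 0*log₂(0) taken as 0
        total = n_pos + n_neg
        p1, p2 = n_pos / total, n_neg / total
        return -sum(p * math.log2(p) for p in (p1, p2) if p > 0)

    print(round(entropy_two_class(6, 4), 4))   # prints 0.971

This agrees with the hand calculation above to three decimal places.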

To learn more about Decision Trees, visit

https://brainly.in/question/4248114
