Computer Science, asked by Akarsh8156, 1 year ago

What is the entropy for a decision tree data-set with 9 positive and 5 negative examples?

Answers

Answered by ankurbadani84

Answer:

The entropy of a decision tree data-set with 9 positive and 5 negative examples is 0.940.

Explanation:

Entropy measures the level of impurity in a group of examples.

Entropy = $-\sum_i p_i \log_2(p_i)$

Entropy([9+, 5−]) = $-\frac{9}{14} \log_2\left(\frac{9}{14}\right) - \frac{5}{14} \log_2\left(\frac{5}{14}\right)$

Entropy([9+, 5−]) ≈ 0.940
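
You can verify the arithmetic with a short calculation in code (a minimal Python sketch, not part of the original answer; the helper name `entropy` is just an illustration):

```python
import math

def entropy(counts):
    """Entropy of a class distribution given as a list of class counts."""
    total = sum(counts)
    # Skip zero counts, since 0 * log2(0) is taken to be 0.
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# 9 positive and 5 negative examples
print(f"{entropy([9, 5]):.3f}")  # 0.940
```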
