How to calculate entropy in Artificial Intelligence?
Answers
Answer:
Suppose you have data:
color  height  quality
=====  ======  =======
green  tall    good
green  short   bad
blue   tall    bad
blue   short   medium
red    tall    medium
red    short   medium
To calculate the entropy for quality in this example:
X = {good, medium, bad}
x1 = {good}, x2 = {bad}, x3 = {medium}
Probability of each x in X:
p1 = 1/6 = 0.16667
p2 = 2/6 = 0.33333
p3 = 3/6 = 0.5
for which the base-2 logarithms are:
log2(p1) = -2.58496
log2(p2) = -1.58496
log2(p3) = -1.0
and therefore entropy for the set is:
H(X) = - (0.16667 * -2.58496) - (0.33333 * -1.58496) - (0.5 * -1.0) = 1.45915
by the formula in the question.
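The arithmetic above can be checked with a short script. This is a minimal sketch; the `entropy` helper name is mine, not from the question:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    # sum p(x) * log2(p(x)) over each distinct class, negated
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

# the quality column from the table above
quality = ["good", "bad", "bad", "medium", "medium", "medium"]
print(round(entropy(quality), 5))  # 1.45915
```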
The remaining task is to repeat this process for each attribute, choosing at each node the attribute that yields the largest reduction in entropy (the information gain) to form the nodes of the tree.
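Comparing attributes this way is usually done with information gain: the entropy of the labels minus the weighted entropy after splitting on the attribute. A sketch of that comparison, assuming ID3-style splitting (`information_gain` is an illustrative helper, not from the question):

```python
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(column, labels):
    """Entropy of labels minus the weighted entropy after splitting on column."""
    n = len(labels)
    groups = {}
    for value, label in zip(column, labels):
        groups.setdefault(value, []).append(label)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

# columns from the table above
color   = ["green", "green", "blue", "blue", "red", "red"]
height  = ["tall", "short", "tall", "short", "tall", "short"]
quality = ["good", "bad", "bad", "medium", "medium", "medium"]

print(round(information_gain(color, quality), 5))   # 0.79248
print(round(information_gain(height, quality), 5))  # 0.20752
```

On this data, color has the higher gain, so it would be chosen for the root node of the tree.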
Answer:
The description of entropy breaks down the formula, but I still don't know how to determine the values of X and p(x), defined as "the proportion of the number of elements in class x to the number of elements in set S".
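Concretely, for the table above X is just the set of distinct class labels in the quality column, and each p(x) is a count divided by the total number of rows. A minimal sketch (variable names are my own):

```python
from collections import Counter

quality = ["good", "bad", "bad", "medium", "medium", "medium"]
counts = Counter(quality)  # counts: good=1, bad=2, medium=3
# p(x) = (elements in class x) / (elements in set S)
p = {x: c / len(quality) for x, c in counts.items()}
# p['good'] = 1/6, p['bad'] = 2/6, p['medium'] = 3/6
```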