Computer Science, asked by hmithravgds976

In a Decision Tree Algorithm, the Information Gain measure is used to measure the uncertainty present in data.
a) False
b) True

Answers

Answered by Meghanasaisri

Answer:

true


Answered by Jasleen0599

This statement is false.

  • Entropy measures the impurity or uncertainty present in the data. It is used to decide how a Decision Tree should split the data.
  • As discussed above, entropy helps us build a suitable decision tree by choosing the best splitter.
  • Entropy can be defined as a measure of the purity of a sub-split. For a two-class problem it lies between 0 and 1, and the entropy of any split can be calculated with the formula E(S) = -Σ p(i) · log2(p(i)), where p(i) is the proportion of samples belonging to class i.
  • Entropy essentially tells us how impure a collection of data is.
  • The term "impure" here means non-homogeneous.
  • In other words, entropy is a measure of (non-)homogeneity: it tells us how impure or non-homogeneous a dataset is.
  • Information Gain, by contrast, measures the reduction in entropy achieved by a split, not the uncertainty itself, which is why the statement is false (see the sketch after this list).
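
Below is a minimal Python sketch (not part of the original answers) illustrating the distinction: entropy quantifies the uncertainty of a node's label distribution, while information gain is the reduction in entropy produced by a split. The function names entropy and information_gain are illustrative, not taken from any particular library.

```python
import math
from collections import Counter

def entropy(labels):
    """Entropy of a list of class labels: -sum(p * log2(p)) over the classes."""
    total = len(labels)
    counts = Counter(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(parent_labels, child_label_groups):
    """Reduction in entropy achieved by splitting the parent node into child groups."""
    total = len(parent_labels)
    weighted_child_entropy = sum(
        (len(group) / total) * entropy(group) for group in child_label_groups
    )
    return entropy(parent_labels) - weighted_child_entropy

# A mixed parent node split into two pure children.
parent = ["yes", "yes", "yes", "no", "no", "no"]
children = [["yes", "yes", "yes"], ["no", "no", "no"]]

print(entropy(parent))                     # 1.0 -> maximum uncertainty for two classes
print(information_gain(parent, children))  # 1.0 -> the split removes all uncertainty
```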