Computer Science, asked by muhammedsafvankp33, 1 year ago

Which of the following is true for a decision tree?
A) A decision tree is an example of a linear classifier.
B) The entropy of a node typically decreases as we go down a decision tree.
C) Entropy is a measure of purity.
D) An attribute with lower mutual information should be preferred to other attributes.

Answers

Answered by jeevan9447

B) The entropy of a node typically decreases as we go down a decision tree.

Answered by smartbrainz

The correct answer is option B)  

Explanation:  

A decision tree is constructed top-down from a root node: at each step the data is partitioned into subsets whose instances have increasingly homogeneous (similar) values of the target. Because each split is chosen to make the resulting subsets purer, the entropy of a node typically decreases as we go down the tree, which is exactly what option B states.
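
To make the entropy decrease concrete, here is a minimal sketch (not part of the original answer; the class counts are made up) that computes the entropy of a parent node and the weighted entropy of its children after a candidate split. The drop between the two is the information gain, i.e. the mutual information with the split attribute, which also shows why attributes with higher, not lower, mutual information are preferred, making option D false.

import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a list of class labels, in bits.
    total = len(labels)
    counts = Counter(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical parent node: 5 positives and 5 negatives (maximally impure).
parent = ["+"] * 5 + ["-"] * 5

# A candidate split sends mostly positive examples left and mostly negative examples right.
left = ["+", "+", "+", "+", "-"]
right = ["+", "-", "-", "-", "-"]

h_parent = entropy(parent)
h_children = (len(left) / len(parent)) * entropy(left) + (len(right) / len(parent)) * entropy(right)

print(f"parent entropy:   {h_parent:.3f}")    # 1.000 bit
print(f"children entropy: {h_children:.3f}")  # ~0.722 bits, lower than the parent
print(f"information gain: {h_parent - h_children:.3f}")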

A decision tree is a predictive modeling approach used in statistics, data mining, and machine learning. Tree models are called classification trees when the target variable takes a finite set of discrete values.

In such a tree structure, the leaves represent class labels and the branches represent conjunctions of feature tests that lead to those class labels.
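
As a concrete illustration (a sketch assuming scikit-learn is available; the weather-style feature values below are invented), a small classification tree shows branches of feature tests ending in leaf class labels:

from sklearn.tree import DecisionTreeClassifier, export_text

# Features: [outlook (0=sunny, 1=rain), humidity (0=normal, 1=high)]
X = [[0, 1], [0, 0], [1, 0], [1, 1], [0, 1], [1, 0]]
y = ["no", "yes", "yes", "no", "no", "yes"]  # class labels stored at the leaves

clf = DecisionTreeClassifier(criterion="entropy").fit(X, y)

# The printed rules show branches (feature tests) leading to leaf class labels.
print(export_text(clf, feature_names=["outlook", "humidity"]))
print(clf.predict([[0, 0]]))  # sunny with normal humidity falls in the "yes" leaf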
