Which of the following statements is NOT true with respect to Random Forest?
- In a random forest, all of the trees are independent of each other.
- Random selection of an equal number of data points is used for training each of the trees.
- Random selection of an equal number of features is used in building each of the trees.
- The training process of individual trees in a random forest is the same as training a decision tree.
Answers
Answered by
The information that is not true is: "In a random forest, all of the trees are independent of each other."
Explanation:
- In a random forest, a multitude of decision trees is created at training time so that tasks such as classification and regression can be performed.
- In a random forest, the trees created are not independent of each other.
- The trees in a random forest are related because they are grown from the same data and draw on the same pool of variables.
- The random forest method generally outperforms a single decision tree (see the sketch below).
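As a rough illustration of that last point, here is a minimal sketch assuming scikit-learn is available; the dataset and parameter values are illustrative, not part of the question.

```python
# Minimal sketch: single decision tree vs. random forest on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Illustrative synthetic classification data.
X, y = make_classification(n_samples=1000, n_features=20, n_informative=5,
                           random_state=0)

tree = DecisionTreeClassifier(random_state=0)
forest = RandomForestClassifier(n_estimators=100, random_state=0)

# 5-fold cross-validated accuracy; the forest typically scores higher because
# it averages the votes of many trees instead of relying on a single one.
print("decision tree:", cross_val_score(tree, X, y, cv=5).mean())
print("random forest:", cross_val_score(forest, X, y, cv=5).mean())
```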
Answered by
Answer:
The training process of individual trees in a random forest is the same as training a decision tree.
Explanation:
This is not true. The training process of each tree in a random forest is the same as for a decision tree, except that at each node only a random subset of the features is considered for the split at that node.
Also, all of the trees in a random forest are independent of each other, since each tree is created with a random subset of the attributes/features/variables, i.e., not all attributes are considered while building each tree; the choice of attributes is random. This ensures that the trees are independent of each other, as sketched below.
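To make the two sources of randomness concrete, here is a minimal from-scratch sketch, assuming scikit-learn, NumPy, and non-negative integer class labels; the function names (train_random_forest, predict) are hypothetical helpers, not a library API.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def train_random_forest(X, y, n_trees=100, seed=0):
    """Grow n_trees decision trees, each on its own bootstrap sample of the
    rows, considering a random feature subset at every split."""
    rng = np.random.default_rng(seed)
    n_samples = X.shape[0]
    trees = []
    for _ in range(n_trees):
        # Random selection of data points: bootstrap sample (with replacement).
        idx = rng.integers(0, n_samples, size=n_samples)
        # max_features="sqrt" is the per-split random feature selection that
        # distinguishes this from training an ordinary decision tree.
        tree = DecisionTreeClassifier(max_features="sqrt",
                                      random_state=int(rng.integers(1 << 31)))
        tree.fit(X[idx], y[idx])
        trees.append(tree)
    return trees

def predict(trees, X):
    # Majority vote over the independently trained trees
    # (assumes non-negative integer class labels).
    votes = np.stack([t.predict(X) for t in trees]).astype(int)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
```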