Computer Science, asked by infoexportingglobal, 1 day ago

Consider decision tree A learned with min_samples_leaf = 500. Now consider decision tree B trained on the same dataset and with the same parameters, except that min_samples_leaf = 50. Which of the following is/are always true?

The depth of B >= the depth of A

The number of nodes in B >= the number of nodes in A

The test error of B <= the test error of A

The training error of B <= the training error of A

Answers

Answered by svkhairnar

Answer:

The depth of B >= the depth of A

The number of nodes in B >= the number of nodes in A

The training error of B <= the training error of A

Explanation:

min_samples_leaf guarantees a minimum number of samples in every leaf. A higher value forces the tree to stop growing earlier, while a lower value lets it keep splitting. So with min_samples_leaf = 50 instead of 500, tree B can grow at least as deep as tree A, and as the tree grows deeper, the number of nodes can only increase. With more nodes and greater depth, the tree fits the training data more closely, so its training error can only stay the same or decrease. But a deeper tree also tends to memorize the training data, so the variance of the model increases; that is why the test-error statement is not always true — B may overfit and generalize worse than A.
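The effect can be seen in a toy sketch (hypothetical code, not scikit-learn's internals): a greedy 1-D classifier tree where a split is allowed only if both children keep at least min_samples_leaf samples and the split strictly reduces training error. Growing the same data with a large and a small min_samples_leaf (an analogue of 500 vs 50) shows the depth, node-count, and training-error relationships from the answer:

```python
def misclassified(labels):
    """Errors made by predicting the majority label at a node."""
    return len(labels) - max(labels.count(c) for c in set(labels))

def build(labels, min_samples_leaf):
    """Grow a greedy tree on labels sorted by feature value.
    Returns (depth, n_nodes, training_errors)."""
    node_err = misclassified(labels)
    n = len(labels)
    best = None  # (error_after_split, split_index)
    # A split at i is legal only if both children have >= min_samples_leaf samples.
    for i in range(min_samples_leaf, n - min_samples_leaf + 1):
        err = misclassified(labels[:i]) + misclassified(labels[i:])
        if err < node_err and (best is None or err < best[0]):
            best = (err, i)
    if best is None:  # no legal, improving split -> this node is a leaf
        return 0, 1, node_err
    i = best[1]
    dl, nl, el = build(labels[:i], min_samples_leaf)
    dr, nr, er = build(labels[i:], min_samples_leaf)
    return 1 + max(dl, dr), 1 + nl + nr, el + er

# Same data, two settings of min_samples_leaf:
labels = [0]*5 + [1]*5 + [0]*5 + [1]*5
tree_A = build(labels, 10)  # large min_samples_leaf -> stops early
tree_B = build(labels, 2)   # small min_samples_leaf -> grows further
print(tree_A)  # -> (0, 1, 10): a single leaf, 10 training errors
print(tree_B)  # -> (1, 3, 5): deeper, more nodes, lower training error
```

Here B is at least as deep as A, has at least as many nodes, and has training error no higher than A's, matching the three statements that are always true.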

Answered by thangtran

Answer:

A and

Explanation:
