Select all that apply with regard to OOB (Out-of-Bag) error. (More than one option may be correct)
All the data points in the training set are considered while calculating the OOB error.
All the data points in the training and test sets are considered while calculating the OOB error.
Each data point in the training set is considered for all the trees in the random forest while calculating the OOB error.
Each data point in the training set is considered only for some of the trees in the random forest while calculating the OOB error.
Answers
Explanation: Bootstrapping and Out-of-Bag Sample
In a Random Forest, the training set for each decision tree is built by bootstrapping, which is simply random sampling with replacement from the original training set.
Let us look at an example to understand how bootstrapping works:
(Figure: Bootstrapping and Out-of-Bag sampling)
Here, the main training dataset consists of five animals. To draw a bootstrap sample from this one main training set:
1. Fix the sample size.
2. Randomly choose a data point for the sample.
3. After selection, put it back in the main set (replacement).
4. Choose another data point from the main training set for the sample and, after selection, put it back.
5. Repeat steps 2-4 until the specified sample size is reached.
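These steps amount to only a few lines of code. Below is a minimal Python sketch; the five-animal training set and the sample size are illustrative assumptions, not part of the original question:

```python
# A minimal sketch of bootstrapping (random sampling with replacement),
# assuming an illustrative five-animal training set and a sample size of 5.
import random

training_set = ["cat", "dog", "horse", "sheep", "cow"]  # main training set
sample_size = 5  # step 1: fix the sample size

# Steps 2-5: random.choice picks a point but never removes it from
# training_set, so this is sampling WITH replacement -- the same animal
# can appear in the sample more than once.
bootstrap_sample = [random.choice(training_set) for _ in range(sample_size)]

# Any animal never picked is "out of bag" for the tree trained on this sample.
out_of_bag = set(training_set) - set(bootstrap_sample)
print("bootstrap sample:", bootstrap_sample)
print("out-of-bag points:", out_of_bag)
```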
Note: A random forest randomizes both data points (via bootstrapped rows) and features (random feature subsets) while building its multiple independent decision trees.
Answer:
All the data points in the training set are considered while calculating the OOB error.
Each data point in the training set is considered only for some of the trees in the random forest while calculating the OOB error.
Explanation:
Each tree in the forest is trained on its own bootstrap sample, and the training points left out of that sample are the tree's out-of-bag (OOB) points. On average a bootstrap sample contains about 63% of the distinct training points, so each point is out-of-bag for roughly one third of the trees. The OOB error is computed by predicting every training point using only the trees that did not see it during training, then averaging the resulting errors. Hence every data point in the training set contributes to the OOB error, but each point is evaluated by only some of the trees; the test set plays no role at all.
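As a concrete illustration of OOB scoring in practice, the sketch below uses scikit-learn's RandomForestClassifier with oob_score=True on a synthetic dataset; the dataset and hyperparameter values are assumptions for demonstration only:

```python
# A minimal sketch of OOB scoring with scikit-learn; the dataset and
# hyperparameter values below are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, random_state=0)

# oob_score=True asks the forest to evaluate each training point using
# only the trees whose bootstrap sample did NOT contain that point.
forest = RandomForestClassifier(
    n_estimators=200,
    oob_score=True,
    bootstrap=True,   # the default; required for OOB scoring
    random_state=0,
)
forest.fit(X, y)

print("OOB accuracy:", forest.oob_score_)  # OOB error = 1 - oob_score_
```

With bootstrap=True, each tree sees its own resample of the rows, and oob_score_ reports the accuracy obtained when every training point is predicted only by the trees that never trained on it, which matches the two correct options above.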