Random forest often gives much more accurate predictions than simple regression models, particularly in scenarios with a large number of predictor variables and a large sample size. This is because it captures the variance of several input variables at the same time and allows a large number of observations to participate in the prediction.
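As a rough illustration of that claim, here is a minimal sketch (using synthetic data and scikit-learn; the dataset sizes and hyperparameters are assumptions, not from the original post) comparing a random forest against a plain linear regression on a non-linear problem with several predictors:

```python
from sklearn.datasets import make_friedman1
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic regression data: a non-linear signal spread over several predictors.
X, y = make_friedman1(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit both models and compare test-set R^2.
for model in (LinearRegression(),
              RandomForestRegressor(n_estimators=200, random_state=0)):
    model.fit(X_train, y_train)
    print(type(model).__name__, r2_score(y_test, model.predict(X_test)))
```

On data like this, the forest typically scores noticeably higher because the linear model cannot represent the non-linear, interacting effects.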
Answers
Random forest is essentially a bootstrapping (bagging) algorithm built on top of the decision tree (CART) model. Say we have 1,000 observations in the complete population with 10 variables. Random forest builds multiple CART models, each on a different sample and a different subset of initial variables. For instance, it might take a random sample of 100 observations and 5 randomly chosen initial variables to build one CART model. It repeats this process (say) 10 times and then makes a final prediction for each observation. The final prediction is a function of the individual predictions; it can simply be the mean of those predictions. A from-scratch sketch of this procedure is shown below.
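The following sketch implements the procedure just described, using the same assumed sizes (1,000 observations, 10 variables, bootstrap samples of 100, 5 variables per tree, 10 trees) and scikit-learn's CART implementation; the data here is synthetic and purely illustrative:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor  # CART-style tree

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))                  # 1,000 observations, 10 variables
y = X[:, 0] + 2 * X[:, 3] + rng.normal(size=1000)

trees = []
for _ in range(10):                              # repeat the process 10 times
    rows = rng.choice(1000, size=100, replace=True)   # random sample of 100 observations
    cols = rng.choice(10, size=5, replace=False)      # 5 randomly chosen variables
    tree = DecisionTreeRegressor().fit(X[rows][:, cols], y[rows])
    trees.append((tree, cols))

# Final prediction for each observation: the mean of the individual tree predictions.
preds = np.mean([tree.predict(X[:, cols]) for tree, cols in trees], axis=0)
print(preds[:5])
```

In practice you would use `sklearn.ensemble.RandomForestRegressor`, which handles the bootstrapping and per-split feature sampling for you, but the loop above mirrors the intuition in the answer: many trees on different samples and variable subsets, averaged into one prediction.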