Though both random forests and boosted trees can overfit, boosted models are more prone to it. Random forests build trees in parallel, which makes them fast and efficient; parallelism can also be exploited in boosted trees. XGBoost, a gradient boosting library, is well known on Kaggle for its strong results. It provides parallel tree boosting (also known as GBDT or GBM).
XGBoost is the leading model for working with standard tabular data (the type of data you store in Pandas DataFrames, as opposed to more exotic types of data like images and videos).
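To make this concrete, here is a minimal sketch of fitting an XGBoost model to tabular data held in a Pandas DataFrame (it assumes the xgboost and scikit-learn packages are installed; the traffic-style column names, synthetic target, and hyperparameter values are purely illustrative, not taken from any of the cited studies):

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score
from xgboost import XGBRegressor

# Illustrative tabular dataset; in practice this would be real data
# loaded into a DataFrame (e.g. roadway traffic observations).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "hour": rng.integers(0, 24, 1000),
    "lanes": rng.integers(1, 5, 1000),
    "speed_limit": rng.choice([30, 50, 70], 1000),
})
# Synthetic target with some noise, so the example is self-contained.
y = 100 * df["lanes"] + 2 * df["hour"] + rng.normal(0, 10, 1000)

X_train, X_test, y_train, y_test = train_test_split(df, y, random_state=0)

# Trees in a boosted ensemble are grown sequentially, but XGBoost
# parallelizes the split search within each tree; n_jobs=-1 uses all cores.
model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.1, n_jobs=-1)
model.fit(X_train, y_train)
print("R^2 on held-out data:", r2_score(y_test, model.predict(X_test)))
```

Keeping n_estimators and learning_rate modest, as above, is one common way to limit the overfitting that boosted models are prone to.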
Recently, XGBoost has received a lot of attention. Hou et al. (2018) used a single DT, RF, GBM, and XGBoost to predict roadway traffic flows and found similar accuracy across methods, with XGBoost requiring the lowest computing times. Wang (2018) used the XGBoost, LightGBM, and DT methods to predict travel mode choices and found the