XGBoost has been one of the leading algorithms in machine learning competitions in recent years.
XGBoost is an ensemble of decision trees built with a gradient boosting framework. The key difference between XGBoost and Random Forest lies in how the trees are structured. In a random forest, fully grown trees are trained on subsamples of the data, each tree growing and expanding until it is highly specialized to predict on its own subsample. The larger a tree grows, the more likely it is to overfit: to maximize accuracy on its subsample, the tree keeps splitting into nodes and leaves as far as its parameters allow, which invites overfitting.

Altering these parameters to allow for shorter trees reduces the risk of overfitting, but at the cost of accuracy. The model below performs at 89% accuracy, whereas the model above performs at 93%. The change is in the `max_depth` parameter: it narrows the information each tree can capture, limiting both the potential for accuracy and the potential for overfitting.
