Xgboost Regression Training on CPU and GPU in Python | by Eligijus Bujokas | Towards Data Science
xgboost GPU performance on low-end GPU vs high-end CPU | by Laurae | Data Science & Design | Medium
[CI] Build and Test XGBoost GPU algorithm with Microsoft Visual Studio 2019 · Issue #7024 · dmlc/xgboost · GitHub
Szilard [Deeper than Deep Learning] on Twitter: "5/n My recommendations are still: If you don't have a GPU, lightgbm (CPU) is the fastest. If you have a GPU, xgboost (GPU) is also…"
cuML PCA and KMeans, and XGBoost on Dask-cuDF performance: (left)... | Download Scientific Diagram
Three ways to speed up XGBoost model training | Anyscale
How to Scale ML Training Models with Nvidia XGBoost and Spark on Databricks - The Databricks Blog
Accelerating the XGBoost algorithm using GPU computing [PeerJ]