Welcome to the wonderland of XGBoost. This page contains a curated list of awesome XGBoost examples, tutorials and blogs. It is inspired by awesome-MXNet, awesome-php and awesome-machine-learning.
- Contributions of examples and benchmarks are more than welcome!
- If you'd like to share how you use xgboost to solve your problem, send a pull request. :)
- If you want to contribute to this list and the examples, please open a new pull request.
This is a list of short code examples introducing different functionalities of the xgboost packages; a minimal Python sketch covering several of them follows the list.
- Basic walkthrough of packages python R Julia
- Customized loss function and evaluation metric python R Julia
- Boosting from existing prediction python R Julia
- Predicting using first n trees python R Julia
- Generalized Linear Model python R Julia
- Cross validation python R Julia
- Predicting leaf indices python R
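For a quick flavor of what these demos cover, here is a minimal, self-contained Python sketch touching several of them. The synthetic data and parameter values are illustrative assumptions, not taken from the linked demos; `custom_metric` assumes xgboost >= 1.6 and `iteration_range` assumes >= 1.4.

```python
import numpy as np
import xgboost as xgb

# Synthetic binary-classification data (illustrative only).
rng = np.random.RandomState(0)
X = rng.randn(200, 5)
y = rng.randint(0, 2, 200)
dtrain = xgb.DMatrix(X, label=y)

# Basic walkthrough: train a small boosted-tree model.
params = {"max_depth": 2, "eta": 0.3, "objective": "binary:logistic"}
bst = xgb.train(params, dtrain, num_boost_round=10)

# Customized loss function (logistic loss) and evaluation metric
# (classification error), supplied via `obj` and `custom_metric`.
def logregobj(preds, dtrain):
    labels = dtrain.get_label()
    preds = 1.0 / (1.0 + np.exp(-preds))  # sigmoid of the raw margin
    grad = preds - labels                 # first-order gradient
    hess = preds * (1.0 - preds)          # second-order gradient
    return grad, hess

def evalerror(preds, dtrain):
    # With a custom objective, `preds` here are raw margins.
    labels = dtrain.get_label()
    return "error", float(np.mean((preds > 0.0) != labels))

bst2 = xgb.train({"max_depth": 2, "eta": 0.3}, dtrain,
                 num_boost_round=10, evals=[(dtrain, "train")],
                 obj=logregobj, custom_metric=evalerror)

# Predicting using only the first n trees (here n = 5).
first5 = bst.predict(dtrain, iteration_range=(0, 5))

# Predicting leaf indices: one column per tree.
leaves = bst.predict(dtrain, pred_leaf=True)

# Generalized linear model: use the linear booster instead of trees.
lin = xgb.train({"booster": "gblinear", "objective": "binary:logistic"},
                dtrain, num_boost_round=10)

# Cross validation.
cv_results = xgb.cv(params, dtrain, num_boost_round=10, nfold=5)
print(cv_results.tail())

# Boosting from an existing prediction: continue training on top of
# the raw margins produced by an earlier model.
dtrain2 = xgb.DMatrix(X, label=y)
dtrain2.set_base_margin(bst.predict(dtrain, output_margin=True))
bst_more = xgb.train(params, dtrain2, num_boost_round=5)
```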
Most of the examples in this section are based on the CLI or Python version; however, the parameter settings can be applied to all versions, as the sketch below illustrates.
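As a small illustration of that portability (the parameter names and values below are arbitrary examples, not a recommended configuration):

```python
# The same parameter names carry over between interfaces: this Python
# dict corresponds line-for-line to `eta = 0.1`, `max_depth = 6`, ...
# in a CLI .conf file, or list(eta = 0.1, max_depth = 6, ...) in R.
params = {
    "booster": "gbtree",
    "objective": "binary:logistic",
    "eta": 0.1,
    "max_depth": 6,
    "subsample": 0.8,
}
```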
"Over the last six months, a new algorithm has come up on Kaggle winning every single competition in this category, it is an algorithm called XGBoost." -- Anthony Goldbloom, Founder & CEO of Kaggle (from his presentation "What Is Winning on Kaggle?" youtube link)
XGBoost contributed to these winning solutions:
- Marios Michailidis, Mathias Müller and HJ van Veen, 1st place of the Dato Truly Native? competition. Link to the Kaggle interview.
- Vlad Mironov, Alexander Guschin, 1st place of the CERN LHCb experiment Flavour of Physics competition. Link to the Kaggle interview.
- Josef Slavicek, 3rd place of the CERN LHCb experiment Flavour of Physics competition. Link to the Kaggle interview.
- Mario Filho, Josef Feigl, Lucas, Gilberto, 1st place of the Caterpillar Tube Pricing competition. Link to the Kaggle interview.
- Qingchen Wang, 1st place of the Liberty Mutual Property Inspection. Link to [the Kaggle interview](http://blog.kaggle.com/2015/09/28/liberty-mutual-property-inspection-winners-interview-qingchen-wang/).
- Chenglong Chen, 1st place of the Crowdflower Search Results Relevance. Link to the winning solution.
- Alexandre Barachant (“Cat”) and Rafał Cycoń (“Dog”), 1st place of the Grasp-and-Lift EEG Detection. Link to the Kaggle interview.
- Halla Yang, 2nd place of the Recruit Coupon Purchase Prediction Challenge. Link to the Kaggle interview.
- Owen Zhang, 1st place of the Avito Context Ad Clicks competition. Link to the Kaggle interview.
There are many other great winning solutions and interviews; this list includes only a selection. Please send pull requests if important ones are missing.
- "Open Source Tools & Data Science Competitions" by Owen Zhang - XGBoost parameter tuning tips
- "Tips for data science competitions" by Owen Zhang - Page 14
- "XGBoost - eXtreme Gradient Boosting" by Tong He
- "How to use XGBoost algorithm in R in easy steps" by TAVISH SRIVASTAVA (Chinese Translation 中文翻译 by HarryZhu)
- "Kaggle Solution: What’s Cooking ? (Text Mining Competition)" by MANISH SARASWAT
- "Better Optimization with Repeated Cross Validation and the XGBoost model - Machine Learning with R)" by Manuel Amunategui (Youtube Link) (Github Link)
- "XGBoost Rossman Parameter Tuning" by Norbert Kozlowski
- "Featurizing log data before XGBoost" by Xavier Conort, Owen Zhang etc
- "West Nile Virus Competition Benchmarks & Tutorials" by Anna Montoya
- "Ensemble Decision Tree with XGBoost" by Bing Xu
- "Notes on eXtreme Gradient Boosting" by ARSHAK NAVRUZYAN (iPython Notebook)
- BayesBoost - Bayesian Optimization using xgboost and sklearn API
- John Chambers Award - 2016 Winner: XGBoost, by Tong He (Simon Fraser University) and Tianqi Chen (University of Washington)