XGBoost: A Scalable Tree Boosting System. Chen, T. & Guestrin, C. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD '16, pages 785–794, New York, NY, USA, August 2016. Association for Computing Machinery.
Tree boosting is a highly effective and widely used machine learning method. In this paper, we describe a scalable end-to-end tree boosting system called XGBoost, which is used widely by data scientists to achieve state-of-the-art results on many machine learning challenges. We propose a novel sparsity-aware algorithm for sparse data and weighted quantile sketch for approximate tree learning. More importantly, we provide insights on cache access patterns, data compression and sharding to build a scalable tree boosting system. By combining these insights, XGBoost scales beyond billions of examples using far fewer resources than existing systems.
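The abstract highlights XGBoost's sparsity-aware split finding and its approximate (quantile-sketch-based) tree learning. As a minimal illustrative sketch, not code from the paper, the snippet below uses the standard xgboost Python package on synthetic sparse data; the dataset, parameters, and round count are arbitrary choices for demonstration.

```python
# Minimal sketch: training an XGBoost booster on sparse input.
# Assumes the `xgboost`, `numpy`, and `scipy` packages are installed;
# the data here is synthetic and purely illustrative.
import numpy as np
import scipy.sparse as sp
import xgboost as xgb

rng = np.random.default_rng(0)
X = sp.random(1000, 50, density=0.1, format="csr", random_state=0)  # mostly-zero features
y = rng.integers(0, 2, size=1000)                                   # binary labels

dtrain = xgb.DMatrix(X, label=y)  # DMatrix accepts SciPy CSR matrices directly
params = {
    "objective": "binary:logistic",
    "tree_method": "approx",  # quantile-sketch-based approximate split finding
    "max_depth": 6,
}
booster = xgb.train(params, dtrain, num_boost_round=20)
preds = booster.predict(dtrain)
```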
@inproceedings{chen_xgboost_2016,
	address = {New York, NY, USA},
	series = {{KDD} '16},
	title = {{XGBoost}: {A} {Scalable} {Tree} {Boosting} {System}},
	isbn = {978-1-4503-4232-2},
	shorttitle = {{XGBoost}},
	url = {https://doi.org/10.1145/2939672.2939785},
	doi = {10.1145/2939672.2939785},
	abstract = {Tree boosting is a highly effective and widely used machine learning method. In this paper, we describe a scalable end-to-end tree boosting system called XGBoost, which is used widely by data scientists to achieve state-of-the-art results on many machine learning challenges. We propose a novel sparsity-aware algorithm for sparse data and weighted quantile sketch for approximate tree learning. More importantly, we provide insights on cache access patterns, data compression and sharding to build a scalable tree boosting system. By combining these insights, XGBoost scales beyond billions of examples using far fewer resources than existing systems.},
	urldate = {2022-08-22},
	booktitle = {Proceedings of the 22nd {ACM} {SIGKDD} {International} {Conference} on {Knowledge} {Discovery} and {Data} {Mining}},
	publisher = {Association for Computing Machinery},
	author = {Chen, Tianqi and Guestrin, Carlos},
	month = aug,
	year = {2016},
	keywords = {large-scale machine learning},
	pages = {785--794},
}
