Model compression. Buciluǎ, C., Caruana, R., & Niculescu-Mizil, A. In Proceedings of the 12th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '06), pages 535–541, New York, NY, USA, August 2006. Association for Computing Machinery.
Often the best performing supervised learning models are ensembles of hundreds or thousands of base-level classifiers. Unfortunately, the space required to store this many classifiers, and the time required to execute them at run-time, prohibits their use in applications where test sets are large (e.g. Google), where storage space is at a premium (e.g. PDAs), and where computational power is limited (e.g. hearing aids). We present a method for "compressing" large, complex ensembles into smaller, faster models, usually without significant loss in performance.
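The compression idea in the abstract can be sketched in a few lines: use the large ensemble to label cheap unlabeled (pseudo) data, then fit a much smaller model to those labels so it mimics the ensemble's function. The snippet below is a minimal toy illustration of that idea, not the paper's actual experimental setup; the 1-D task, the threshold-vote "ensemble", and the single-threshold "small model" are all invented for demonstration.

```python
import random

random.seed(0)

# Toy 1-D task. A hypothetical "large ensemble" of 50 noisy threshold
# classifiers stands in for the slow model we want to compress.
thresholds = [random.gauss(0.0, 0.3) for _ in range(50)]

def ensemble_predict(x):
    # Majority vote over all base-level classifiers (expensive at run time).
    votes = sum(1 for t in thresholds if x > t)
    return 1 if votes > len(thresholds) / 2 else 0

# Step 1: draw unlabeled pseudo-data and label it with the ensemble.
pseudo_x = [random.uniform(-1.0, 1.0) for _ in range(2000)]
pseudo_y = [ensemble_predict(x) for x in pseudo_x]

# Step 2: fit a single small model (one threshold) to mimic those labels.
def agreement_at(t):
    return sum((x > t) == bool(y) for x, y in zip(pseudo_x, pseudo_y))

best_t = max(pseudo_x, key=agreement_at)

def small_predict(x):
    # The compressed model: one comparison instead of 50.
    return 1 if x > best_t else 0

# How faithfully the small model reproduces the ensemble on the pseudo-data.
agreement = sum(
    small_predict(x) == ensemble_predict(x) for x in pseudo_x
) / len(pseudo_x)
```

Because the small model is trained to match the ensemble's outputs rather than any ground-truth labels, it can approximate the ensemble's decision function closely while being far cheaper to store and evaluate, which is the trade-off the abstract describes.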
@inproceedings{bucilua_model_2006,
	address = {New York, NY, USA},
	series = {{KDD} '06},
	title = {Model compression},
	isbn = {978-1-59593-339-3},
	url = {https://doi.org/10.1145/1150402.1150464},
	doi = {10.1145/1150402.1150464},
	abstract = {Often the best performing supervised learning models are ensembles of hundreds or thousands of base-level classifiers. Unfortunately, the space required to store this many classifiers, and the time required to execute them at run-time, prohibits their use in applications where test sets are large (e.g. Google), where storage space is at a premium (e.g. PDAs), and where computational power is limited (e.g. hearing aids). We present a method for "compressing" large, complex ensembles into smaller, faster models, usually without significant loss in performance.},
	urldate = {2020-10-23},
	booktitle = {Proceedings of the 12th {ACM} {SIGKDD} international conference on {Knowledge} discovery and data mining},
	publisher = {Association for Computing Machinery},
	author = {Buciluǎ, Cristian and Caruana, Rich and Niculescu-Mizil, Alexandru},
	month = aug,
	year = {2006},
	keywords = {model compression, supervised learning},
	pages = {535--541},
}
