The State of Knowledge Distillation for Classification. Ruffy, F. & Chahal, K. December, 2019.
We survey various knowledge distillation (KD) strategies for simple classification tasks and implement a set of techniques that claim state-of-the-art accuracy. Our experiments using standardized model architectures, fixed compute budgets, and consistent training schedules indicate that many of these distillation results are hard to reproduce. This is especially apparent with methods using some form of feature distillation. Further examination reveals a lack of generalizability where these techniques may only succeed for specific architectures and training settings. We observe that appropriately tuned classical distillation in combination with a data augmentation training scheme gives an orthogonal improvement over other techniques. We validate this approach and open-source our code.
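The "classical distillation" the abstract refers to is the standard teacher-student formulation: a KL-divergence term between temperature-softened teacher and student distributions combined with ordinary cross-entropy on the hard labels. The sketch below is a minimal, hedged illustration of that loss in PyTorch; the function name and the default values of the temperature T and mixing weight alpha are illustrative assumptions, not values taken from the paper or its released code.

```python
import torch
import torch.nn.functional as F

def classical_kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classical (Hinton-style) knowledge distillation loss.

    Mixes a soft-target KL term, computed at temperature T and scaled by T^2,
    with the usual cross-entropy on ground-truth labels. T and alpha are
    illustrative defaults and would need tuning, as the paper emphasizes.
    """
    # Soft targets: KL divergence between temperature-softened distributions
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the true labels
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In training, the teacher's logits are computed under `torch.no_grad()` and only the student is updated; the paper's finding is that this loss, properly tuned and paired with a data augmentation scheme, composes well with other techniques.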
@article{ruffy_state_2019,
	title = {The {State} of {Knowledge} {Distillation} for {Classification}},
	url = {https://arxiv.org/abs/1912.10850v1},
	abstract = {We survey various knowledge distillation (KD) strategies for simple
classification tasks and implement a set of techniques that claim
state-of-the-art accuracy. Our experiments using standardized model
architectures, fixed compute budgets, and consistent training schedules
indicate that many of these distillation results are hard to reproduce. This is
especially apparent with methods using some form of feature distillation.
Further examination reveals a lack of generalizability where these techniques
may only succeed for specific architectures and training settings. We observe
that appropriately tuned classical distillation in combination with a data
augmentation training scheme gives an orthogonal improvement over other
techniques. We validate this approach and open-source our code.},
	language = {en},
	urldate = {2020-03-13},
	author = {Ruffy, Fabian and Chahal, Karanbir},
	month = dec,
	year = {2019},
}