Knowledge Distillation. Upadhyay, U. May, 2019.
Knowledge distillation is a model compression method in which a small model is trained to mimic a pretrained, larger model.
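The abstract summarizes the core idea: a small student network is trained to reproduce the behavior of a larger pretrained teacher, typically by matching the teacher's temperature-softened output distribution in addition to the ground-truth labels (Hinton et al.'s formulation). A minimal sketch of such a distillation loss, assuming PyTorch; the temperature T and mixing weight alpha are illustrative hyperparameters, not values taken from the post:

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft-target term: KL divergence between the teacher's and student's
    # temperature-softened distributions, scaled by T^2 so gradient
    # magnitudes stay comparable to the hard-label term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard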
@misc{upadhyay_knowledge_2019,
	title = {Knowledge {Distillation}},
	url = {https://medium.com/neuralmachine/knowledge-distillation-dc241d7c2322},
	abstract = {Knowledge distillation is a model compression method in which a small model is trained to mimic a pretrained, larger model.},
	language = {en},
	urldate = {2019-11-18},
	journal = {Medium},
	author = {Upadhyay, Ujjwal},
	month = may,
	year = {2019}
}
