Cauwenberghs, G. & Poggio, T. Incremental and decremental support vector machine learning. In Proceedings of the 13th International Conference on Neural Information Processing Systems (NIPS'00), pages 388–394, Cambridge, MA, USA, January 2000. MIT Press.
An on-line recursive algorithm for training support vector machines, one vector at a time, is presented. Adiabatic increments retain the Kuhn-Tucker conditions on all previously seen training data, in a number of steps each computed analytically. The incremental procedure is reversible, and decremental "unlearning" offers an efficient method to exactly evaluate leave-one-out generalization performance. Interpretation of decremental unlearning in feature space sheds light on the relationship between generalization and geometry of the data.
@inproceedings{cauwenberghs_incremental_2000,
	address = {Cambridge, MA, USA},
	series = {{NIPS}'00},
	title = {Incremental and decremental support vector machine learning},
	abstract = {An on-line recursive algorithm for training support vector machines, one vector at a time, is presented. Adiabatic increments retain the Kuhn-Tucker conditions on all previously seen training data, in a number of steps each computed analytically. The incremental procedure is reversible, and decremental "unlearning" offers an efficient method to exactly evaluate leave-one-out generalization performance. Interpretation of decremental unlearning in feature space sheds light on the relationship between generalization and geometry of the data.},
	urldate = {2022-03-15},
	booktitle = {Proceedings of the 13th {International} {Conference} on {Neural} {Information} {Processing} {Systems}},
	publisher = {MIT Press},
	author = {Cauwenberghs, Gert and Poggio, Tomaso},
	month = jan,
	year = {2000},
	pages = {388--394},
}