Fast learning rates for plug-in classifiers. Audibert, J.-Y. & Tsybakov, A. B. The Annals of Statistics, 35(2):608–633, 2007. doi:10.1214/009053606000001217. Abstract: It has been recently shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than n^{-1/2}. The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order n^{-1}, and (ii) the plug-in classifiers generally converge more slowly than the classifiers based on empirical risk minimization. We show that both conjectures are not correct. In particular, we construct plug-in classifiers that can achieve not only fast, but also super-fast rates, that is, rates faster than n^{-1}. We establish minimax lower bounds showing that the obtained rates cannot be improved.
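For context, a short sketch of the setup behind the abstract. The definitions below are standard (regression function, Bayes classifier, plug-in rule, Tsybakov margin assumption) and are paraphrased rather than quoted from the paper; C > 0 and α ≥ 0 are generic constants.

\[
\eta(x) = P(Y = 1 \mid X = x), \qquad
f^*(x) = \mathbf{1}\{\eta(x) \ge 1/2\}, \qquad
\hat f_n(x) = \mathbf{1}\{\hat\eta_n(x) \ge 1/2\},
\]
% margin (low noise) assumption with exponent \alpha:
\[
P\bigl(0 < |\eta(X) - 1/2| \le t\bigr) \le C\, t^{\alpha} \quad \text{for all } t > 0.
\]

A plug-in classifier substitutes any estimator \hat\eta_n of the regression function η into the Bayes rule. Under the margin assumption with exponent α, combined with β-Hölder smoothness of η in dimension d, the rates obtained in the paper are of order n^{-β(1+α)/(2β+d)}, which is faster than n^{-1} precisely when αβ > d/2, the "super-fast" regime.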
@article{audibert_fast_2007,
title = {Fast learning rates for plug-in classifiers},
volume = {35},
doi = {10.1214/009053606000001217},
	abstract = {It has been recently shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than $n^{-1/2}$. The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order $n^{-1}$, and (ii) the plug-in classifiers generally converge more slowly than the classifiers based on empirical risk minimization. We show that both conjectures are not correct. In particular, we construct plug-in classifiers that can achieve not only fast, but also super-fast rates, that is, rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.},
number = {2},
	journal = {The Annals of Statistics},
	author = {Audibert, Jean-Yves and Tsybakov, Alexandre B.},
year = {2007},
	keywords = {62G07, 62G08, 62H05, 68T10, classification, excess risk, fast rates of convergence, minimax lower bounds, plug-in classifiers, statistical learning},
pages = {608--633},
	publisher = {Institute of Mathematical Statistics},
}
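To make the plug-in principle concrete, here is a minimal runnable sketch: it thresholds a k-nearest-neighbor estimate of η at 1/2. Note the paper itself analyzes local polynomial estimators of η, so the k-NN choice, and every name here (knn_eta, plug_in_classify), is purely illustrative.

import numpy as np

def knn_eta(X_train, y_train, x, k=15):
    """k-NN estimate of the regression function eta(x) = P(Y = 1 | X = x)."""
    dist = np.linalg.norm(X_train - x, axis=1)  # distance from x to every training point
    neighbors = np.argsort(dist)[:k]            # indices of the k nearest points
    return y_train[neighbors].mean()            # fraction of positive labels among them

def plug_in_classify(X_train, y_train, x, k=15):
    """Plug-in rule: predict 1 iff the estimated eta(x) is at least 1/2."""
    return int(knn_eta(X_train, y_train, x, k) >= 0.5)

# Toy usage: labels drawn with eta(x) = sigmoid(3 * x_1),
# so the Bayes classifier predicts 1 iff x_1 >= 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 2))
eta = 1.0 / (1.0 + np.exp(-3.0 * X[:, 0]))
y = (rng.random(2000) < eta).astype(int)
print(plug_in_classify(X, y, np.array([1.0, 0.0])))  # eta is about 0.95 here, so expect 1

Any estimator of η can be plugged in this way; the paper's contribution is showing that, with a good estimator and under the margin assumption, this simple thresholding is not only competitive with empirical risk minimization but can attain super-fast rates.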
{"_id":"tdpcYhCSeQZiJiCsT","bibbaseid":"audibert-tsybakov-fastlearningratesforpluginclassifiers-2007","author_short":["Audibert, J. Y.","Tsybakov, A. B."],"bibdata":{"bibtype":"article","type":"article","title":"Fast learning rates for plug-in classifiers","volume":"35","doi":"10.1214/009053606000001217","abstract":"It has been recently shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than n−1/2. The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order n−1, and (ii) the plug-in classifiers generally converge more slowly than the classifiers based on empirical risk minimization. We show that both conjectures are not correct. In particular, we construct plug-in classifiers that can achieve not only fast, but also super-fast rates, that is, rates faster than n−1. We establish minimax lower bounds showing that the obtained rates cannot be improved.","number":"2","journal":"https://doi.org/10.1214/009053606000001217","author":[{"propositions":[],"lastnames":["Audibert"],"firstnames":["Jean","Yves"],"suffixes":[]},{"propositions":[],"lastnames":["Tsybakov"],"firstnames":["Alexandre","B."],"suffixes":[]}],"year":"2007","keywords":"62G07, 62G08, 62H05, 68T10, classification, excess risk, fast rates of convergence, minimax lower bounds, plug-in classifiers, Statistical learning","pages":"608–633","annote":"The following values have no corresponding Zotero field:publisher: Institute of Mathematical Statistics","bibtex":"@article{audibert_fast_2007,\n\ttitle = {Fast learning rates for plug-in classifiers},\n\tvolume = {35},\n\tdoi = {10.1214/009053606000001217},\n\tabstract = {It has been recently shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than n−1/2. The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order n−1, and (ii) the plug-in classifiers generally converge more slowly than the classifiers based on empirical risk minimization. We show that both conjectures are not correct. In particular, we construct plug-in classifiers that can achieve not only fast, but also super-fast rates, that is, rates faster than n−1. We establish minimax lower bounds showing that the obtained rates cannot be improved.},\n\tnumber = {2},\n\tjournal = {https://doi.org/10.1214/009053606000001217},\n\tauthor = {Audibert, Jean Yves and Tsybakov, Alexandre B.},\n\tyear = {2007},\n\tkeywords = {62G07, 62G08, 62H05, 68T10, classification, excess risk, fast rates of convergence, minimax lower bounds, plug-in classifiers, Statistical learning},\n\tpages = {608--633},\n\tannote = {The following values have no corresponding Zotero field:publisher: Institute of Mathematical Statistics},\n}\n\n","author_short":["Audibert, J. Y.","Tsybakov, A. 
B."],"key":"audibert_fast_2007","id":"audibert_fast_2007","bibbaseid":"audibert-tsybakov-fastlearningratesforpluginclassifiers-2007","role":"author","urls":{},"keyword":["62G07","62G08","62H05","68T10","classification","excess risk","fast rates of convergence","minimax lower bounds","plug-in classifiers","Statistical learning"],"metadata":{"authorlinks":{}},"html":""},"bibtype":"article","biburl":"https://bibbase.org/f/Ceciz2iNjTZgQNtDc/mypubs_mar_2024.bib","dataSources":["m8Y57GfgnRrMKZTQS","epk5yKhDyD37NAsSC"],"keywords":["62g07","62g08","62h05","68t10","classification","excess risk","fast rates of convergence","minimax lower bounds","plug-in classifiers","statistical learning"],"search_terms":["fast","learning","rates","plug","classifiers","audibert","tsybakov"],"title":"Fast learning rates for plug-in classifiers","year":2007}