Online learning with kernels. Kivinen, J., Smola, A. J., & Williamson, R. C. IEEE Transactions on Signal Processing, 52(8):2165–2176, August 2004. doi:10.1109/TSP.2004.830991

Abstract: Kernel-based algorithms such as support vector machines have achieved considerable success in various problems in the batch setting, where all of the training data is available in advance. Support vector machines combine the so-called kernel trick with the large margin idea. There has been little use of these methods in an online setting suitable for real-time applications. In this paper, we consider online learning in a reproducing kernel Hilbert space. By considering classical stochastic gradient descent within a feature space and the use of some straightforward tricks, we develop simple and computationally efficient algorithms for a wide range of problems such as classification, regression, and novelty detection. In addition to allowing the exploitation of the kernel trick in an online setting, we examine the value of large margins for classification in the online setting with a drifting target. We derive worst-case loss bounds, and moreover, we show the convergence of the hypothesis to the minimizer of the regularized risk functional. We present some experimental results that support the theory as well as illustrate the power of the new algorithms for online novelty detection.
@article{kivinen_online_2004,
title = {Online learning with kernels},
volume = {52},
issn = {1941-0476},
doi = {10.1109/TSP.2004.830991},
abstract = {Kernel-based algorithms such as support vector machines have achieved considerable success in various problems in the batch setting, where all of the training data is available in advance. Support vector machines combine the so-called kernel trick with the large margin idea. There has been little use of these methods in an online setting suitable for real-time applications. In this paper, we consider online learning in a reproducing kernel Hilbert space. By considering classical stochastic gradient descent within a feature space and the use of some straightforward tricks, we develop simple and computationally efficient algorithms for a wide range of problems such as classification, regression, and novelty detection. In addition to allowing the exploitation of the kernel trick in an online setting, we examine the value of large margins for classification in the online setting with a drifting target. We derive worst-case loss bounds, and moreover, we show the convergence of the hypothesis to the minimizer of the regularized risk functional. We present some experimental results that support the theory as well as illustrate the power of the new algorithms for online novelty detection.},
number = {8},
journal = {IEEE Transactions on Signal Processing},
author = {Kivinen, J. and Smola, A.J. and Williamson, R.C.},
month = aug,
year = {2004},
keywords = {Australia, Condition monitoring, Convergence, Gaussian processes, Hilbert space, Kernel, Signal processing algorithms, Stochastic processes, Support vector machines, Training data},
pages = {2165--2176},
}
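The abstract's core recipe — stochastic gradient descent on a regularized loss, carried out in a reproducing kernel Hilbert space — can be made concrete with a short sketch. The following Python is a minimal illustration of that idea for binary classification with the hinge loss, in the spirit of the paper's NORMA-style updates, not a reproduction of its exact pseudocode: the class and parameter names (`eta`, `lam`, `budget`), the fixed margin of 1, and the oldest-first truncation rule are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, z, gamma=1.0):
    """RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    return np.exp(-gamma * np.linalg.norm(np.asarray(x) - np.asarray(z)) ** 2)

class KernelOnlineSGD:
    """Sketch of online SGD in an RKHS for hinge-loss classification.

    The hypothesis is a kernel expansion f(x) = sum_i alpha_i k(x_i, x).
    Each step: (1) predict with the current f, (2) shrink all coefficients
    by (1 - eta*lam) to implement the regularization gradient, (3) on a
    margin violation add a new expansion term, (4) truncate old terms so
    per-step cost stays bounded (the budget is an illustrative choice).
    """

    def __init__(self, kernel=gaussian_kernel, eta=0.1, lam=0.01, budget=200):
        self.kernel, self.eta, self.lam, self.budget = kernel, eta, lam, budget
        self.support, self.alphas = [], []

    def predict(self, x):
        # f_t(x): kernel expansion over the stored support points.
        return sum(a * self.kernel(s, x) for s, a in zip(self.support, self.alphas))

    def step(self, x, y):
        f_x = self.predict(x)                      # predict with f_t first
        # Regularization term: f <- (1 - eta*lam) f
        self.alphas = [(1 - self.eta * self.lam) * a for a in self.alphas]
        # Hinge-loss subgradient: add a term only if the margin is violated
        if y * f_x < 1:
            self.support.append(np.asarray(x))
            self.alphas.append(self.eta * y)
        # Drop the oldest term once over budget (coefficients have decayed)
        if len(self.support) > self.budget:
            self.support.pop(0)
            self.alphas.pop(0)
        return f_x
```

A toy usage, on synthetic linearly separable data (again purely illustrative):

```python
rng = np.random.default_rng(0)
clf = KernelOnlineSGD()
for _ in range(500):
    x = rng.normal(size=2)
    y = 1.0 if x[0] + x[1] > 0 else -1.0
    clf.step(x, y)
print(np.sign(clf.predict(np.array([1.0, 1.0]))))  # expect 1.0
```

Because regularization geometrically decays old coefficients, truncating the oldest expansion terms changes the hypothesis only slightly, which is what makes a bounded-memory online kernel method plausible in the first place.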
{"_id":{"_str":"53aa542353ed86111a000577"},"__v":0,"authorIDs":[],"author_short":["Kivinen, J.","Smola, A.","Williamson, R."],"bibbaseid":"kivinen-smola-williamson-onlinelearningwithkernels-2004","bibdata":{"bibtype":"article","type":"article","title":"Online learning with kernels","volume":"52","issn":"1941-0476","doi":"10.1109/TSP.2004.830991","abstract":"Kernel-based algorithms such as support vector machines have achieved considerable success in various problems in batch setting, where all of the training data is available in advance. Support vector machines combine the so-called kernel trick with the large margin idea. There has been little use of these methods in an online setting suitable for real-time applications. In this paper, we consider online learning in a reproducing kernel Hilbert space. By considering classical stochastic gradient descent within a feature space and the use of some straightforward tricks, we develop simple and computationally efficient algorithms for a wide range of problems such as classification, regression, and novelty detection. In addition to allowing the exploitation of the kernel trick in an online setting, we examine the value of large margins for classification in the online setting with a drifting target. We derive worst-case loss bounds, and moreover, we show the convergence of the hypothesis to the minimizer of the regularized risk functional. We present some experimental results that support the theory as well as illustrating the power of the new algorithms for online novelty detection.","number":"8","journal":"IEEE Transactions on Signal Processing","author":[{"propositions":[],"lastnames":["Kivinen"],"firstnames":["J."],"suffixes":[]},{"propositions":[],"lastnames":["Smola"],"firstnames":["A.J."],"suffixes":[]},{"propositions":[],"lastnames":["Williamson"],"firstnames":["R.C."],"suffixes":[]}],"month":"August","year":"2004","note":"Conference Name: IEEE Transactions on Signal Processing","keywords":"Australia, Condition monitoring, Convergence, Gaussian processes, Hilbert space, Kernel, Signal processing algorithms, Stochastic processes, Support vector machines, Training data","pages":"2165–2176","bibtex":"@article{kivinen_online_2004,\n\ttitle = {Online learning with kernels},\n\tvolume = {52},\n\tissn = {1941-0476},\n\tdoi = {10.1109/TSP.2004.830991},\n\tabstract = {Kernel-based algorithms such as support vector machines have achieved considerable success in various problems in batch setting, where all of the training data is available in advance. Support vector machines combine the so-called kernel trick with the large margin idea. There has been little use of these methods in an online setting suitable for real-time applications. In this paper, we consider online learning in a reproducing kernel Hilbert space. By considering classical stochastic gradient descent within a feature space and the use of some straightforward tricks, we develop simple and computationally efficient algorithms for a wide range of problems such as classification, regression, and novelty detection. In addition to allowing the exploitation of the kernel trick in an online setting, we examine the value of large margins for classification in the online setting with a drifting target. We derive worst-case loss bounds, and moreover, we show the convergence of the hypothesis to the minimizer of the regularized risk functional. 
We present some experimental results that support the theory as well as illustrating the power of the new algorithms for online novelty detection.},\n\tnumber = {8},\n\tjournal = {IEEE Transactions on Signal Processing},\n\tauthor = {Kivinen, J. and Smola, A.J. and Williamson, R.C.},\n\tmonth = aug,\n\tyear = {2004},\n\tnote = {Conference Name: IEEE Transactions on Signal Processing},\n\tkeywords = {Australia, Condition monitoring, Convergence, Gaussian processes, Hilbert space, Kernel, Signal processing algorithms, Stochastic processes, Support vector machines, Training data},\n\tpages = {2165--2176},\n}\n\n\n\n","author_short":["Kivinen, J.","Smola, A.","Williamson, R."],"key":"kivinen_online_2004","id":"kivinen_online_2004","bibbaseid":"kivinen-smola-williamson-onlinelearningwithkernels-2004","role":"author","urls":{},"keyword":["Australia","Condition monitoring","Convergence","Gaussian processes","Hilbert space","Kernel","Signal processing algorithms","Stochastic processes","Support vector machines","Training data"],"metadata":{"authorlinks":{}},"downloads":0,"html":""},"bibtype":"article","biburl":"https://bibbase.org/zotero/mh_lenguyen","creationDate":"2014-06-25T04:46:27.260Z","downloads":0,"keywords":["australia","condition monitoring","convergence","gaussian processes","hilbert space","kernel","signal processing algorithms","stochastic processes","support vector machines","training data"],"search_terms":["online","learning","kernels","kivinen","smola","williamson"],"title":"Online learning with kernels","year":2004,"dataSources":["9cexBw6hrwgyZphZZ","iwKepCrWBps7ojhDx"]}