Issues in evaluation of stream learning algorithms. Gama, J., Sebastião, R., & Rodrigues, P. P. In Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '09), pages 329–338, New York, NY, USA, June 2009. Association for Computing Machinery. doi: 10.1145/1557019.1557060

Abstract: Learning from data streams is a research area of increasing importance, and many stream learning algorithms have been developed. Most of them learn decision models that continuously evolve over time, run in resource-aware environments, and detect and react to changes in the environment generating the data. One important issue, not yet adequately addressed, is the design of experimental work to evaluate and compare decision models that evolve over time; there is no gold standard for assessing performance in non-stationary environments. This paper proposes a general framework for assessing predictive stream learning algorithms. We advocate the use of predictive sequential methods for error estimation, i.e. the prequential error. The prequential error allows us to monitor the evolution of the performance of models that evolve over time; nevertheless, it is known to be a pessimistic estimator compared to holdout estimates. To obtain more reliable estimators we need some forgetting mechanism; two viable alternatives are sliding windows and fading factors. We observe that the prequential error converges to a holdout estimator when estimated over a sliding window or using fading factors. We present illustrative examples of prequential error estimators using fading factors for the tasks of: i) assessing the performance of a learning algorithm; ii) comparing learning algorithms; iii) hypothesis testing using the McNemar test; and iv) change detection using the Page-Hinkley test. In these tasks, the prequential error estimated using fading factors provides reliable estimates. Compared to sliding windows, fading factors are faster and memoryless, a requirement for streaming applications. This paper is a contribution to the discussion of good practices in performance assessment when learning dynamic models that evolve over time.
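For illustration only, the sketch below shows the two ideas named in the abstract: a fading-factor prequential error (test each example before training on it, with past losses discounted by a factor alpha) and a standard Page-Hinkley change detector fed with that error. It is not the authors' code; the model interface (predict/learn), the 0/1 loss, and the parameter values are assumptions.

# Fading-factor prequential error: E_i = S_i / B_i with
# S_i = L_i + alpha * S_{i-1} and B_i = 1 + alpha * B_{i-1}.
def prequential_error_fading(stream, model, alpha=0.995):
    """Yield the fading-factor prequential error after each (x, y) example.

    `model` is assumed to expose predict(x) and learn(x, y) (hypothetical interface).
    alpha = 1 recovers the plain (unfaded) prequential error.
    """
    weighted_loss = 0.0   # S_i
    weighted_count = 0.0  # B_i
    for x, y in stream:
        y_hat = model.predict(x)            # test first ...
        loss = 0.0 if y_hat == y else 1.0   # 0/1 loss for classification
        model.learn(x, y)                   # ... then train on the same example
        weighted_loss = loss + alpha * weighted_loss
        weighted_count = 1.0 + alpha * weighted_count
        yield weighted_loss / weighted_count

class PageHinkley:
    """Minimal Page-Hinkley detector for increases in a monitored signal.

    delta: tolerated magnitude of change; lam: detection threshold
    (both are illustrative defaults, not values from the paper).
    """
    def __init__(self, delta=0.005, lam=50.0):
        self.delta = delta
        self.lam = lam
        self.n = 0
        self.mean = 0.0     # running mean of the signal
        self.cum = 0.0      # m_T: cumulative deviation from the mean
        self.cum_min = 0.0  # M_T: minimum of m_T observed so far

    def update(self, x):
        """Feed one observation (e.g. the current prequential error); return True on alarm."""
        self.n += 1
        self.mean += (x - self.mean) / self.n
        self.cum += x - self.mean - self.delta
        self.cum_min = min(self.cum_min, self.cum)
        return self.cum - self.cum_min > self.lam

A typical use would be to iterate over prequential_error_fading(stream, model) and pass each value to PageHinkley.update, signalling drift when it returns True; unlike a sliding window, both structures keep only a handful of scalars in memory.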
@inproceedings{gama_issues_2009,
address = {New York, NY, USA},
series = {{KDD} '09},
title = {Issues in evaluation of stream learning algorithms},
isbn = {978-1-60558-495-9},
url = {https://doi.org/10.1145/1557019.1557060},
doi = {10.1145/1557019.1557060},
abstract = {Learning from data streams is a research area of increasing importance. Nowadays, several stream learning algorithms have been developed. Most of them learn decision models that continuously evolve over time, run in resource-aware environments, detect and react to changes in the environment generating data. One important issue, not yet conveniently addressed, is the design of experimental work to evaluate and compare decision models that evolve over time. There are no golden standards for assessing performance in non-stationary environments. This paper proposes a general framework for assessing predictive stream learning algorithms. We defend the use of Predictive Sequential methods for error estimate - the prequential error. The prequential error allows us to monitor the evolution of the performance of models that evolve over time. Nevertheless, it is known to be a pessimistic estimator in comparison to holdout estimates. To obtain more reliable estimators we need some forgetting mechanism. Two viable alternatives are: sliding windows and fading factors. We observe that the prequential error converges to an holdout estimator when estimated over a sliding window or using fading factors. We present illustrative examples of the use of prequential error estimators, using fading factors, for the tasks of: i) assessing performance of a learning algorithm; ii) comparing learning algorithms; iii) hypothesis testing using McNemar test; and iv) change detection using Page-Hinkley test. In these tasks, the prequential error estimated using fading factors provide reliable estimators. In comparison to sliding windows, fading factors are faster and memory-less, a requirement for streaming applications. This paper is a contribution to a discussion in the good-practices on performance assessment when learning dynamic models that evolve over time.},
urldate = {2022-03-15},
booktitle = {Proceedings of the 15th {ACM} {SIGKDD} international conference on {Knowledge} discovery and data mining},
publisher = {Association for Computing Machinery},
author = {Gama, João and Sebastião, Raquel and Rodrigues, Pedro Pereira},
month = jun,
year = {2009},
keywords = {data streams, evaluation design},
pages = {329--338},
}