Meta-ADD: A meta-learning based pre-trained model for concept drift active detection. Yu, H., Zhang, Q., Liu, T., Lu, J., Wen, Y., & Zhang, G. Information Sciences, 608:996–1009, August 2022.
Concept drift is a phenomenon that commonly occurs in data streams and needs to be detected, because it means the statistical properties of the target variable that the model is trying to predict change over time in unforeseen ways. Most current detection methods are based on a hypothesis-test framework. As a result, these methods require a hypothesis test to be specified and, more importantly, cannot identify the type of drift. Specifying a hypothesis test requires prior understanding of the data stream, and the inability to identify the type of concept drift results in a loss of drift information. Hence, in this paper, to avoid specifying a hypothesis test and to identify the type of concept drift, we propose Active Drift Detection based on Meta learning (Meta-ADD), a novel framework that learns to classify concept drift by pre-training a model offline on data streams with known drifts and then fine-tuning it online to improve detection accuracy. Specifically, in the pre-training phase, we extract meta-features based on the error rates associated with various types of concept drift, and then develop a pre-trained model, called the meta-detector, via a prototypical neural network that represents each concept drift class as a corresponding prototype. In the detection phase, the meta-detector is fine-tuned to adapt to the real data stream via simple stream-based active learning. Hence, Meta-ADD does not need a hypothesis test to detect concept drifts and identify their types automatically, which directly supports drift understanding. Experimental results verify the effectiveness of Meta-ADD.
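
The central mechanism described in the abstract is that each known drift type is represented as a prototype in an embedding space computed from error-rate meta-features, and a new stream window is assigned to the nearest prototype. Below is a minimal sketch of that idea in Python. The window length, the hand-crafted embedding (which stands in for the paper's learned prototypical network), and the drift-class labels are illustrative assumptions rather than the authors' exact design.

    import numpy as np

    WINDOW = 50  # assumed meta-feature window length (number of per-example error indicators)

    def embed(error_window):
        # Placeholder embedding: mean, linear trend, and variance of the error-rate window.
        # In Meta-ADD this role is played by a learned prototypical neural network.
        x = np.asarray(error_window, dtype=float)
        trend = np.polyfit(np.arange(len(x)), x, deg=1)[0]
        return np.array([x.mean(), trend, x.var()])

    def build_prototypes(support):
        # Each drift class is represented by the mean embedding of its support windows.
        return {cls: np.mean([embed(w) for w in windows], axis=0)
                for cls, windows in support.items()}

    def classify(error_window, prototypes):
        # Assign the query window to the class whose prototype is nearest (Euclidean distance).
        z = embed(error_window)
        return min(prototypes, key=lambda cls: np.linalg.norm(z - prototypes[cls]))

    # Usage with two hypothetical drift classes built from synthetic 0/1 error indicators.
    rng = np.random.default_rng(0)
    support = {
        "no_drift": [rng.binomial(1, 0.1, WINDOW) for _ in range(20)],
        "sudden": [np.concatenate([rng.binomial(1, 0.1, WINDOW // 2),
                                   rng.binomial(1, 0.6, WINDOW // 2)]) for _ in range(20)],
    }
    prototypes = build_prototypes(support)
    query = np.concatenate([rng.binomial(1, 0.1, WINDOW // 2),
                            rng.binomial(1, 0.6, WINDOW // 2)])
    print(classify(query, prototypes))  # expected: "sudden"

In the paper the embedding is learned offline and the meta-detector is further fine-tuned online with stream-based active learning; this sketch only shows the offline prototype construction and nearest-prototype classification step.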
@article{yu_meta-add_2022,
	title = {Meta-{ADD}: {A} meta-learning based pre-trained model for concept drift active detection},
	volume = {608},
	issn = {0020-0255},
	shorttitle = {Meta-{ADD}},
	url = {https://www.sciencedirect.com/science/article/pii/S0020025522007125},
	doi = {10.1016/j.ins.2022.07.022},
	abstract = {Concept drift is a phenomenon that commonly occurs in data streams and needs to be detected, because it means the statistical properties of the target variable that the model is trying to predict change over time in unforeseen ways. Most current detection methods are based on a hypothesis-test framework. As a result, these methods require a hypothesis test to be specified and, more importantly, cannot identify the type of drift. Specifying a hypothesis test requires prior understanding of the data stream, and the inability to identify the type of concept drift results in a loss of drift information. Hence, in this paper, to avoid specifying a hypothesis test and to identify the type of concept drift, we propose Active Drift Detection based on Meta learning (Meta-ADD), a novel framework that learns to classify concept drift by pre-training a model offline on data streams with known drifts and then fine-tuning it online to improve detection accuracy. Specifically, in the pre-training phase, we extract meta-features based on the error rates associated with various types of concept drift, and then develop a pre-trained model, called the meta-detector, via a prototypical neural network that represents each concept drift class as a corresponding prototype. In the detection phase, the meta-detector is fine-tuned to adapt to the real data stream via simple stream-based active learning. Hence, Meta-ADD does not need a hypothesis test to detect concept drifts and identify their types automatically, which directly supports drift understanding. Experimental results verify the effectiveness of Meta-ADD.},
	language = {en},
	urldate = {2022-07-12},
	journal = {Information Sciences},
	author = {Yu, Hang and Zhang, Qingyong and Liu, Tianyu and Lu, Jie and Wen, Yimin and Zhang, Guangquan},
	month = aug,
	year = {2022},
	keywords = {Concept drift, Drift detection, Pre-trained model, Prototypical neural networks},
	pages = {996--1009},
}
