‘Right to Be Forgotten’: Analyzing the Impact of Forgetting Data Using K-NN Algorithm in Data Stream Learning. Libera, C., Miranda, L., Bernardini, F., Mastelini, S., & Viterbo, J. In Janssen, M., Csáki, C., Lindgren, I., Loukis, E., Melin, U., Viale Pereira, G., Rodríguez Bolívar, M. P., & Tambouris, E., editors, Electronic Government, of Lecture Notes in Computer Science, pages 530–542, Cham, 2022. Springer International Publishing.
New international regulations concerning personal data management guarantee the ‘Right to Be Forgotten’. One might request to have their data erased from third-party tools and services. This requirement is especially challenging when considering the behavior of machine learning estimators that will need to forget portions of their knowledge. In this paper, we investigate the impact of these learning and forgetting policies in data stream learning. In data stream mining, the sheer volume of instances typically makes it unfeasible to store the data or to retrain the learning models from scratch. Hence, more efficient solutions are needed to deal with the dynamic nature of online machine learning. We modify an incremental k-NN classifier to enable it to erase its past data, and we also investigate the impact of data forgetting on the obtained predictive performance. Our proposal is compared against the original k-NN algorithm using seven non-stationary stream datasets. Our results show that the forgetting-enabled algorithm can achieve similar prediction patterns compared to the vanilla one, although it yields lower predictive performance at the beginning of the learning process. Such a scenario is a typical cold-start behavior often observed in data stream mining applications, and not necessarily related to the employed forgetting mechanisms.
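The idea described in the abstract can be illustrated with a minimal sketch (not the authors' implementation, and the class and method names below are assumptions for illustration only): an incremental k-NN classifier over a bounded sliding window, extended with a `forget()` method that erases every stored instance tied to a given data owner, approximating a ‘Right to Be Forgotten’ request in a stream.

```python
# Illustrative sketch of a forgetting-enabled incremental k-NN over a
# bounded window. Names (ForgettingKNN, learn_one, forget, predict_one)
# are hypothetical, not from the paper.
from collections import deque, Counter
import math


class ForgettingKNN:
    def __init__(self, k=3, window_size=1000):
        self.k = k
        # Each stored instance is (owner_id, features, label); the deque's
        # maxlen discards the oldest instances as the stream advances.
        self.window = deque(maxlen=window_size)

    def learn_one(self, owner_id, x, y):
        self.window.append((owner_id, x, y))

    def forget(self, owner_id):
        # Erase every instance belonging to owner_id, keeping the
        # window capacity unchanged.
        kept = [t for t in self.window if t[0] != owner_id]
        self.window = deque(kept, maxlen=self.window.maxlen)

    def predict_one(self, x):
        if not self.window:
            return None  # cold start: no stored data yet
        # Majority vote among the k nearest stored instances.
        dists = sorted(
            (math.dist(x, xi), yi) for (_, xi, yi) in self.window
        )
        votes = Counter(label for _, label in dists[: self.k])
        return votes.most_common(1)[0][0]
```

Note that after a `forget()` call the classifier simply votes over the remaining instances; the cold-start dip the authors report corresponds to the `None`/sparse-window regime before enough instances accumulate.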
@inproceedings{libera_right_2022,
	address = {Cham},
	series = {Lecture {Notes} in {Computer} {Science}},
	title = {‘{Right} to {Be} {Forgotten}’: {Analyzing} the {Impact} of {Forgetting} {Data} {Using} {K}-{NN} {Algorithm} in {Data} {Stream} {Learning}},
	isbn = {978-3-031-15086-9},
	shorttitle = {‘{Right} to {Be} {Forgotten}’},
	doi = {10.1007/978-3-031-15086-9_34},
	abstract = {New international regulations concerning personal data management guarantee the ‘Right to Be Forgotten’. One might request to have their data erased from third-party tools and services. This requirement is especially challenging when considering the behavior of machine learning estimators that will need to forget portions of their knowledge. In this paper, we investigate the impact of these learning and forgetting policies in data stream learning. In data stream mining, the sheer volume of instances typically makes it unfeasible to store the data or to retrain the learning models from scratch. Hence, more efficient solutions are needed to deal with the dynamic nature of online machine learning. We modify an incremental k-NN classifier to enable it to erase its past data, and we also investigate the impact of data forgetting on the obtained predictive performance. Our proposal is compared against the original k-NN algorithm using seven non-stationary stream datasets. Our results show that the forgetting-enabled algorithm can achieve similar prediction patterns compared to the vanilla one, although it yields lower predictive performance at the beginning of the learning process. Such a scenario is a typical cold-start behavior often observed in data stream mining applications, and not necessarily related to the employed forgetting mechanisms.},
	language = {en},
	booktitle = {Electronic {Government}},
	publisher = {Springer International Publishing},
	author = {Libera, Caio and Miranda, Leandro and Bernardini, Flávia and Mastelini, Saulo and Viterbo, José},
	editor = {Janssen, Marijn and Csáki, Csaba and Lindgren, Ida and Loukis, Euripidis and Melin, Ulf and Viale Pereira, Gabriela and Rodríguez Bolívar, Manuel Pedro and Tambouris, Efthimios},
	year = {2022},
	keywords = {Data stream, K-NN, Lazy learning, Right to be forgotten, Stream learning},
	pages = {530--542},
}