Interrater disagreement resolution: A systematic procedure to reach consensus in annotation tasks. Oortwijn, Y., Ossenkoppele, T., & Betti, A. In Proceedings of the workshop on Human Evaluation of NLP Systems (HumEval) at EACL 2021 (virtual), 19-20 April 2021, Kyiv, Ukraine, 2021.
We present a systematic procedure for interrater disagreement resolution. The procedure is general, but of particular use in multiple-annotator tasks geared towards ground truth construction. We motivate our proposal by arguing that, barring cases in which the researchers’ goal is eliciting disagreement, interrater disagreement is a sign of poor quality in the design or the description of a task. Consensus among annotators, we maintain, should be striven for, through a systematic procedure for disagreement resolution such as the one we describe.
@inproceedings{oortwijn_interrater_2021,
	address = {Kyiv, Ukraine},
	title = {Interrater disagreement resolution: {A} systematic procedure to reach consensus in annotation tasks},
	abstract = {We present a systematic procedure for interrater disagreement resolution. The procedure is general, but of particular use in multiple-annotator tasks geared towards ground truth construction. We motivate our proposal by arguing that, barring cases in which the researchers’ goal is eliciting disagreement, interrater disagreement is a sign of poor quality in the design or the description of a task. Consensus among annotators, we maintain, should be striven for, through a systematic procedure for disagreement resolution such as the one we describe.},
	booktitle = {Proceedings of the workshop on {Human} {Evaluation} of {NLP} {Systems} ({HumEval}) at {EACL} 2021 (virtual), 19-20 {April} 2021},
	author = {Oortwijn, Yvette and Ossenkoppele, Thijs and Betti, Arianna},
	year = {2021},
}