Assessing intra, inter and total agreement with replicated readings. Barnhart, H. X., Song, J., & Haber, M. J. Statistics in Medicine, 24(9):1371–1384, 2005.
@article{barnhart_assessing_nodate,
	title = {Assessing intra, inter and total agreement with replicated readings},
	volume = {24},
	year = {2005},
	copyright = {Copyright © 2004 John Wiley \& Sons, Ltd.},
	issn = {1097-0258},
	url = {https://onlinelibrary.wiley.com/doi/abs/10.1002/sim.2006},
	doi = {10.1002/sim.2006},
	abstract = {In clinical studies, assessing the agreement of multiple readings on the same subject plays an important role in the evaluation of a continuous measurement scale. The multiple readings within a subject may be replicated readings taken by the same method and/or readings taken by several methods (e.g. different technologies or several raters). Traditional agreement data for a given subject often consist of either replicated readings from only one method or multiple readings from several methods, where only one reading is taken from each of these methods. In the first case, only intra-method agreement can be evaluated. In the second case, traditional agreement indices such as the intra-class correlation (ICC) or the concordance correlation coefficient (CCC) are often reported as inter-method agreement. We argue that these indices are in fact measures of total agreement, which contains both inter- and intra-method agreement. Only if there are replicated readings from several methods for a given subject can one assess intra-, inter- and total agreement simultaneously. In this paper, we present a new inter-method agreement index, the inter-CCC, and a total agreement index, the total-CCC, for agreement data with replicated readings from several methods, where the ICCs within methods are used to assess intra-method agreement for each of the several methods. The relationship of the total-CCC with the inter-CCC and the ICCs is investigated. We propose a generalized estimating equations approach for estimation and inference. Simulation studies are conducted to assess the performance of the proposed approach, and data from a carotid stenosis screening study are used for illustration. Copyright © 2004 John Wiley \& Sons, Ltd.},
	language = {en},
	number = {9},
	urldate = {2018-07-02},
	journal = {Statistics in Medicine},
	author = {Barnhart, Huiman X. and Song, Jingli and Haber, Michael J.},
	keywords = {agreement, concordance correlation coefficient, generalized estimating equations, intraclass correlation, reliability},
	pages = {1371--1384}
}

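Note: the abstract builds on the classic concordance correlation coefficient. As background for this entry, a minimal sketch of Lin's CCC for two paired readings is given below (using population moments, `ddof=0`); this is the standard CCC, not the paper's inter-CCC/total-CCC estimators, and the function name is illustrative only.

```python
import numpy as np

def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between two paired readings.

    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2),
    computed with population (ddof=0) moments. Equals 1 only for
    perfect agreement (y == x); penalizes both scale and location shifts.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    sxy = np.cov(x, y, ddof=0)[0, 1]          # off-diagonal of 2x2 covariance matrix
    sx2, sy2 = x.var(), y.var()               # population variances
    return 2.0 * sxy / (sx2 + sy2 + (x.mean() - y.mean()) ** 2)
```

For identical readings the CCC is 1; a constant offset between methods lowers it even when the Pearson correlation is still 1, which is why the abstract treats the CCC as an agreement (not merely correlation) index.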