A unified approach for assessing agreement for continuous and categorical data

Lawrence Lin, A. S. Hedayat, Wenting Wu

Research output: Contribution to journal › Article › peer-review


Abstract

This paper proposes several concordance correlation coefficient (CCC) indices to measure the agreement among k raters, each with multiple (m) readings on each of n subjects, for continuous and categorical data. In addition, for normal data, the paper proposes the coverage probability (CP) and total deviation index (TDI). These indices measure intra-, inter-, and total-rater agreement. Intra-rater indices measure the agreement among the multiple readings from the same rater. Inter-rater indices measure the agreement among different raters based on the averages of their multiple readings. Total-rater indices measure the agreement among different raters based on individual readings. In addition to agreement, the paper also assesses intra-, inter-, and total-rater precision and accuracy. Through a two-way mixed model, all CCC, precision, accuracy, TDI, and CP indices are expressed as functions of variance components, and the generalized estimating equations (GEE) method is used to estimate and perform inference on all these functions of variance components. Each previously proposed approach for assessing agreement becomes a special case of the proposed approach. For continuous data, as m approaches ∞, the proposed estimates reduce to the agreement indices proposed by Barnhart et al. (2005). When m = 1, the proposed estimate reduces to the ICC proposed by Carrasco and Jover (2003) and to the OCCC proposed by Lin (1989), King and Chinchilli (2001a), and Barnhart et al. (2002). When m = 1 and k = 2, the proposed estimate reduces to the original CCC proposed by Lin (1989). For categorical data, when k = 2 and m = 1, the proposed estimate and its associated inference reduce to kappa for binary data and to weighted kappa with squared weights for ordinal data.
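As a concrete illustration of the k = 2, m = 1 special case mentioned above, the sketch below computes Lin's (1989) original CCC for two raters with one reading each, together with normal-theory TDI and CP estimates from the paired differences. The function names are illustrative, and these paired-sample estimators follow the standard closed-form definitions from the agreement literature, not the paper's GEE variance-component machinery.

```python
import numpy as np
from scipy.stats import norm

def ccc(x, y):
    """Lin's (1989) concordance correlation coefficient: the k = 2, m = 1
    special case of the unified indices described in the abstract."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    # Biased (1/n) moments, matching Lin's original sample estimator.
    sxy = np.mean((x - mx) * (y - my))
    return 2.0 * sxy / (x.var() + y.var() + (mx - my) ** 2)

def tdi(x, y, p=0.90):
    """Normal-approximation total deviation index: the boundary capturing
    a proportion p of absolute paired differences |X - Y|, approximated
    via the mean squared deviation (MSD)."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    msd = np.mean(d ** 2)
    return norm.ppf((1.0 + p) / 2.0) * np.sqrt(msd)

def cp(x, y, delta):
    """Coverage probability that a paired difference falls within
    +/- delta, assuming D = X - Y is normally distributed."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    mu, sigma = d.mean(), d.std(ddof=1)
    return norm.cdf((delta - mu) / sigma) - norm.cdf((-delta - mu) / sigma)
```

For example, `ccc(x, y)` on two raters' readings of the same subjects returns 1 only under perfect agreement, while `cp(x, y, delta)` estimates how often the raters differ by less than a clinically tolerable `delta`.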

Original language: English (US)
Pages (from-to): 629-652
Number of pages: 24
Journal: Journal of Biopharmaceutical Statistics
Volume: 17
Issue number: 4
State: Published - Jul 2007

Keywords

  • Accuracy
  • CCC
  • CP
  • ICC
  • Inter-agreement
  • Intra-agreement
  • Kappa
  • MSD
  • Precision
  • TDI
  • Total-agreement

ASJC Scopus subject areas

  • Statistics and Probability
  • Pharmacology
  • Pharmacology (medical)
