
09/01/2011 - kappa

posted Sep 28, 2011, 5:35 PM by Chief Resident   [ updated Sep 28, 2011, 5:37 PM by Purnema Madahar ]
A guest contributor who asked to remain unnamed (unnecessary modesty in my opinion) sent me the following on the interpretation of kappa in response to yesterday's bit on pallor:

Kappa           Agreement
< 0             No agreement
0 to 0.20       Slight
0.21 to 0.40    Fair
0.41 to 0.60    Moderate
0.61 to 0.80    Substantial
0.81 to 1.00    Almost perfect

Of note, this scale is very widely cited, but its cutoffs are arbitrary: they reflect the opinions of Landis and Koch rather than any statistical derivation.

For those of you not entirely familiar with the kappa statistic: it measures agreement between observers beyond what would be expected by chance alone.
Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33(1):159-174.
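
To make the "beyond chance alone" idea concrete, here is a minimal sketch of Cohen's kappa for two raters, using the standard formula kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e is the agreement expected by chance from each rater's marginal frequencies. The two clinicians and the pallor ratings below are hypothetical data for illustration, not from the original post:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters beyond chance,
    computed as (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    # Observed agreement: fraction of items on which the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal frequency per category.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two clinicians rate 10 patients for pallor
# (1 = pallor present, 0 = absent). They agree on 8 of 10 patients.
a = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]
print(round(cohens_kappa(a, b), 2))  # 0.58 -- "moderate" on the scale above
```

Note that the raw agreement here is 80%, yet kappa is only 0.58, because both raters call pallor "present" most of the time and would agree fairly often by chance alone. That gap is exactly what kappa is designed to expose.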