Cohen's Kappa Coefficient (\( k \)) is a statistical measure used to evaluate inter-rater reliability for categorical items. It is more robust than a simple percentage-agreement calculation because it accounts for the possibility of agreement occurring by chance. The value of kappa ranges from -1 to 1, where 1 indicates perfect agreement, 0 indicates agreement no better than chance, and negative values indicate less agreement than would be expected by chance.
To calculate Cohen's Kappa Coefficient:
\[ k = \frac{p_o - p_e}{1 - p_e} \]
Where:
- \( p_o \) is the observed agreement between the raters, and
- \( p_e \) is the hypothetical probability of chance agreement.
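As a minimal sketch, the formula translates directly into a short Python function; the function and argument names below are illustrative rather than taken from any particular library:

```python
def cohens_kappa(p_o: float, p_e: float) -> float:
    """Cohen's kappa from the observed agreement p_o and
    the hypothetical probability of chance agreement p_e."""
    return (p_o - p_e) / (1 - p_e)
```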
Let's assume the following values:
- Observed agreement: \( p_o = 0.75 \)
- Hypothetical probability of chance agreement: \( p_e = 0.5 \)
Step 1: Subtract the hypothetical probability of chance agreement from the observed agreement:
\[ p_o - p_e = 0.75 - 0.5 = 0.25 \]
Step 2: Subtract the hypothetical probability of chance agreement from 1:
\[ 1 - p_e = 1 - 0.5 = 0.5 \]
Step 3: Divide the result from step 1 by the result from step 2:
\[ k = \frac{0.25}{0.5} = 0.5 \]
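The same steps can be checked in code. The snippet below reuses the hypothetical `cohens_kappa` function from the sketch above; the rater labels in the second part are made up purely for illustration of computing kappa from raw ratings:

```python
# Verify the worked example: kappa from the assumed p_o and p_e.
p_o, p_e = 0.75, 0.5
print(cohens_kappa(p_o, p_e))  # 0.5, matching Step 3

# In practice, p_o and p_e are derived from two raters' labels.
# scikit-learn's cohen_kappa_score computes kappa directly from them
# (the labels below are invented for illustration only).
from sklearn.metrics import cohen_kappa_score

rater_a = ["yes", "no", "yes", "yes", "no"]
rater_b = ["yes", "no", "no", "yes", "no"]
print(cohen_kappa_score(rater_a, rater_b))
```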