Stat Tools
Fast statistical calculators

Cohen’s kappa (agreement)

Inter-rater agreement for categorical labels beyond chance.

Inputs

Labels can be words or numbers. Matching is exact after trimming surrounding whitespace.
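A minimal sketch of this matching rule in Python, assuming labels are compared as strings after trimming; the function names are illustrative, not the tool's internals, and the exact comparison is taken to be case-sensitive as an assumption.

```python
def normalize(label) -> str:
    # Accept words or numbers; compare as strings after trimming whitespace.
    # Assumption: comparison stays case-sensitive ("exact" match).
    return str(label).strip()

def paired_labels(rater1, rater2):
    # Pair the two raters' labels item by item for the confusion table.
    if len(rater1) != len(rater2):
        raise ValueError("Both raters must label the same number of items")
    return [(normalize(a), normalize(b)) for a, b in zip(rater1, rater2)]
```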

Result

n: 8
Observed agreement (Po): 0.875
Expected agreement (Pe): 0.5
Kappa: 0.75
Interpretation: substantial agreement (Landis–Koch rule of thumb for 0.61–0.80).
R1 \ R2      no   yes   Row total
no            3     0           3
yes           1     4           5
Col total     4     4           8
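The raw per-item labels are not shown in the output; the pairs below are one hypothetical assignment consistent with the counts above, used only to show how the confusion table and its marginals can be rebuilt with collections.Counter.

```python
from collections import Counter

# Hypothetical (R1, R2) pairs consistent with the table: 3 x (no, no), 1 x (yes, no), 4 x (yes, yes).
pairs = [("no", "no")] * 3 + [("yes", "no")] + [("yes", "yes")] * 4

table = Counter(pairs)                          # cell counts keyed by (R1, R2)
row_totals = Counter(r1 for r1, _ in pairs)     # R1 marginals: no=3, yes=5
col_totals = Counter(r2 for _, r2 in pairs)     # R2 marginals: no=4, yes=4

print(table[("no", "no")], table[("no", "yes")],
      table[("yes", "no")], table[("yes", "yes")])  # 3 0 1 4
```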
Step-by-step (a worked code sketch follows the list)
  1. Build confusion table

    Formula: Count occurrences of (R1, R2) label pairs
    Substitute: Labels: no, yes
    Result: Table built for n=8 items
  2. Observed agreement (Po)

    Formula: Po = (Σ diagonal) / n
    Substitute: Po = 7/8
    Result: Po = 0.875
  3. Expected agreement (Pe)

    Formula: Pe = Σ (row_marg(label) · col_marg(label)) / n²
    Substitute: Pe = (3·4 + 5·4) / 8² = 32/64
    Result: Pe = 0.5
  4. Cohen’s kappa

    Formula: κ = (Po − Pe) / (1 − Pe)
    Substitute: (0.875 − 0.5) / (1 − 0.5)
    Result: κ = 0.75
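A self-contained sketch of the four steps above, assuming Python as the language; the label lists are a hypothetical assignment consistent with the confusion table (the tool's actual inputs are not shown) and reproduce Po = 0.875, Pe = 0.5, κ = 0.75.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two equal-length lists of categorical labels."""
    n = len(rater1)
    table = Counter(zip(rater1, rater2))                         # step 1: confusion table
    labels = set(rater1) | set(rater2)
    po = sum(table[(lab, lab)] for lab in labels) / n            # step 2: observed agreement
    row = Counter(rater1)                                        # R1 (row) marginals
    col = Counter(rater2)                                        # R2 (column) marginals
    pe = sum(row[lab] * col[lab] for lab in labels) / n ** 2     # step 3: expected agreement
    if pe == 1:
        raise ValueError("Expected agreement is 1; kappa is undefined")
    return po, pe, (po - pe) / (1 - pe)                          # step 4: kappa

# Hypothetical labels consistent with the table above: 3 x no/no, 1 x yes/no, 4 x yes/yes.
r1 = ["no", "no", "no", "yes", "yes", "yes", "yes", "yes"]
r2 = ["no", "no", "no", "no", "yes", "yes", "yes", "yes"]
po, pe, kappa = cohens_kappa(r1, r2)
print(po, pe, kappa)  # 0.875 0.5 0.75
```

Running the sketch prints 0.875, 0.5, and 0.75, matching the Result panel above.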