Stat Tools
Fast statistical calculators
Tools: Test Selector, Significance, t-test, ANOVA, Std Dev & Uncertainty, Confidence Interval, Sample Size, Multiple Testing, Effect Size, Outliers
Cohen’s kappa (agreement)
Measures inter-rater agreement for categorical labels, correcting for the agreement expected by chance.
Inputs
Rater 1 labels (one per item): yes no yes yes no yes no yes
Rater 2 labels (one per item): yes no no yes no yes no yes
Labels can be words or numbers. Matching is exact after trimming whitespace.
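A minimal sketch of how these inputs might be parsed (Python; parse_labels is a hypothetical helper, not the tool's actual code):

```python
def parse_labels(text: str) -> list[str]:
    # Split the pasted text on whitespace and trim each token.
    # Labels may be words or numbers; comparison later is exact on these strings.
    return [token.strip() for token in text.split()]

r1 = parse_labels("yes no yes yes no yes no yes")
r2 = parse_labels("yes no no yes no yes no yes")
assert len(r1) == len(r2) == 8  # one label per item, same items for both raters
```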
Result
n: 8
Observed agreement (Po): 0.875
Expected agreement (Pe): 0.5
Kappa: 0.75
Interpretation: substantial agreement (rule of thumb).
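The qualitative label follows the common Landis & Koch (1977) rule of thumb. A small sketch of that mapping (Python; the function name is illustrative, not part of this tool):

```python
def interpret_kappa(kappa: float) -> str:
    # Common Landis & Koch (1977) rule-of-thumb bands for Cohen's kappa.
    if kappa < 0:
        return "poor (less than chance) agreement"
    if kappa <= 0.20:
        return "slight agreement"
    if kappa <= 0.40:
        return "fair agreement"
    if kappa <= 0.60:
        return "moderate agreement"
    if kappa <= 0.80:
        return "substantial agreement"
    return "almost perfect agreement"

print(interpret_kappa(0.75))  # substantial agreement
```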
R1 \ R2     no   yes   Row total
no           3     0           3
yes          1     4           5
Col total    4     4           8
Step-by-step
Build confusion table
Formula: Count occurrences of each (R1, R2) label pair
Substitute: Labels: no, yes
Result: Table built for n = 8 items
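A minimal sketch of this step (Python; variable names are illustrative and the tool's internals may differ):

```python
from collections import Counter

r1 = ["yes", "no", "yes", "yes", "no", "yes", "no", "yes"]
r2 = ["yes", "no", "no",  "yes", "no", "yes", "no", "yes"]

# Count occurrences of each (Rater 1 label, Rater 2 label) pair.
pairs = Counter(zip(r1, r2))
labels = sorted(set(r1) | set(r2))  # ["no", "yes"]

for a in labels:
    print(a, [pairs[(a, b)] for b in labels])
# no  [3, 0]
# yes [1, 4]
```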
Observed agreement (Po)
Formula: Po = (Σ diagonal) / n
Substitute: Po = (3 + 4) / 8 = 7/8
Result: Po = 0.875
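As a quick check of the arithmetic (Python, using the counts from the table above):

```python
# Diagonal cells of the table: (no, no) = 3 and (yes, yes) = 4 exact matches.
diagonal = 3 + 4
n = 8
po = diagonal / n
print(po)  # 0.875
```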
Expected agreement (Pe)
Formula: Pe = Σ (row_marg(label) · col_marg(label)) / n²
Substitute: Pe = (3·4 + 5·4) / 8² = 32/64
Result: Pe = 0.5
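The same substitution as a small sketch (Python; marginal counts taken from the confusion table above):

```python
row_totals = {"no": 3, "yes": 5}  # Rater 1 marginals
col_totals = {"no": 4, "yes": 4}  # Rater 2 marginals
n = 8

# Chance agreement: probability that both raters pick the same label independently.
pe = sum(row_totals[label] * col_totals[label] for label in row_totals) / n**2
print(pe)  # (3*4 + 5*4) / 64 = 0.5
```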
Cohen’s kappa
Formula: κ = (Po − Pe) / (1 − Pe)
Substitute: κ = (0.875 − 0.5) / (1 − 0.5) = 0.375 / 0.5
Result: κ = 0.75
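Putting the steps together, a self-contained sketch that reproduces the example above (Python; cohens_kappa is an illustrative function, not this tool's API):

```python
from collections import Counter

def cohens_kappa(r1: list[str], r2: list[str]) -> float:
    """Cohen's kappa for two raters labelling the same items."""
    if len(r1) != len(r2):
        raise ValueError("both raters must label the same number of items")
    n = len(r1)
    labels = set(r1) | set(r2)

    # Observed agreement: share of items with identical labels from both raters.
    po = sum(a == b for a, b in zip(r1, r2)) / n

    # Expected (chance) agreement from the marginal label frequencies.
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[label] * c2[label] for label in labels) / n**2

    return (po - pe) / (1 - pe)

r1 = "yes no yes yes no yes no yes".split()
r2 = "yes no no yes no yes no yes".split()
print(cohens_kappa(r1, r2))  # 0.75
```

For real work, scikit-learn's sklearn.metrics.cohen_kappa_score computes the same statistic and should return 0.75 for these inputs.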