Cohen's kappa for inter-rater reliability

Hi everyone,

I have a brief question regarding the Cohen’s kappa tool in ELAN. When the results window appears, there is a section titled “Global results of all files and all tiers” that gives kappa, kappa max, and raw agreement values per annotation value. Do these values INCLUDE or EXCLUDE unlinked/unmatched annotations?


Hello Minnie,

The per-value results in that section are based on the “Global per value agreement tables” a bit further on in the file. In those tables all annotations are included; the sum of all cells in the “Overall agreement matrix” equals the sum of the cells of each per-value 2 x 2 matrix.
So, unmatched annotations are indeed included in the mentioned kappa values.
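To make the consequence of that concrete, here is a minimal sketch (not ELAN’s actual code) of how Cohen’s kappa and raw agreement fall out of an overall agreement matrix when unmatched annotations are kept in as an extra category, so every unlinked annotation counts as a disagreement. The matrix values and category labels are hypothetical:

```python
def kappa_from_matrix(matrix):
    """Cohen's kappa and raw agreement for a square agreement matrix
    (rater 1 in rows, rater 2 in columns)."""
    n = sum(sum(row) for row in matrix)
    size = len(matrix)
    # Observed agreement: proportion of annotation pairs on the diagonal.
    p_o = sum(matrix[i][i] for i in range(size)) / n
    # Expected chance agreement from the row and column marginals.
    row_totals = [sum(row) for row in matrix]
    col_totals = [sum(matrix[i][j] for i in range(size)) for j in range(size)]
    p_e = sum(row_totals[i] * col_totals[i] for i in range(size)) / (n * n)
    return (p_o - p_e) / (1 - p_e), p_o

# Hypothetical overall matrix: two annotation values plus a final
# "unmatched" row/column for annotations without an overlapping partner.
matrix = [
    [20, 2, 3],   # value "A"
    [1, 15, 2],   # value "B"
    [4, 3, 0],    # unmatched
]
kappa, raw = kappa_from_matrix(matrix)
```

Because the unmatched row and column contribute only off-diagonal counts, including them lowers both the raw agreement and kappa compared to a computation over matched annotations only.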