Intercoder reliability

Hi, I’ve got a question about whether it is possible to calculate inter-coder (or inter-annotator) reliability in ELAN version 6.4.
We would like to calculate agreement for the annotations of eight coders, who all coded the same video clip for one variable (i.e. on eight different tiers). I have used the “modified kappa” tool in ELAN. All tiers have a different suffix and all of them are selected for the calculation. However, the output file shows that only the first two tiers are actually compared to each other, not all eight. I am wondering what I am doing wrong. Is there a way to calculate agreement for more than two tiers at a time?

All the best,


This is only possible when selecting the (modified) Fleiss’ kappa option in the first step. The other, longer-standing options always compare exactly two tiers.
A bit more information can be found in the relevant section of the manual (although I just noticed that not all paragraphs there have been updated since the introduction of the Fleiss option). That section links to a Wikipedia page with background information on that variant of the kappa calculation.
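For background on what ELAN computes in this mode: Fleiss’ kappa generalizes agreement to any number of raters by comparing mean observed agreement per item against chance agreement derived from the overall category proportions. A minimal sketch in plain Python, using the worked example from the Wikipedia article on Fleiss’ kappa (14 raters, 10 items, 5 categories — illustrative data, not ELAN output):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa. counts[i][j] = number of raters who assigned
    item i to category j; every row must sum to the same rater count."""
    n_items = len(counts)
    n_raters = sum(counts[0])  # assumed constant across items
    total = n_items * n_raters
    n_cats = len(counts[0])
    # Proportion of all assignments falling into each category
    p_j = [sum(row[j] for row in counts) / total for j in range(n_cats)]
    p_e = sum(p * p for p in p_j)  # expected (chance) agreement
    # Observed agreement per item, averaged over items
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in counts
    ) / n_items
    return (p_bar - p_e) / (1 - p_e)

# Worked example from the Wikipedia article (expected kappa ~ 0.210)
data = [
    [0, 0, 0, 0, 14], [0, 2, 6, 4, 2], [0, 0, 3, 5, 6], [0, 3, 9, 2, 0],
    [2, 2, 8, 1, 1],  [7, 7, 0, 0, 0], [3, 2, 6, 3, 0], [2, 5, 3, 2, 2],
    [6, 5, 2, 1, 0],  [0, 2, 2, 3, 7],
]
print(round(fleiss_kappa(data), 3))
```

Note that ELAN’s option is a *modified* Fleiss’ kappa adapted to tier-based annotations, so its exact figures may differ from this textbook formula; the sketch is only meant to convey the underlying idea.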