Hi, I’ve got a question about the possibility of calculating inter-coder (or inter-annotator) reliability in ELAN version 6.4.
We would like to calculate agreement for the annotations of eight coders, who each coded the same video clip for one variable (i.e. on eight different tiers). I have used the “modified kappa” tool in ELAN. Each tier has a different suffix, and all of them are selected for the calculation. However, the output file shows that only the first two tiers are actually compared to each other, not all eight. I am wondering what I am doing wrong. Is there a way to calculate agreement for more than two tiers at a time?
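In case it helps to clarify what we are after, here is a rough sketch of the kind of multi-rater agreement we would otherwise compute outside ELAN, assuming the eight tiers can be exported and aligned into a segments-by-coders table (the example labels and the use of Fleiss' kappa via statsmodels are just my own assumption, not something ELAN itself produces):

    # Sketch: Fleiss' kappa for eight coders, computed outside ELAN.
    # Assumes the tier annotations have already been exported and aligned
    # into a segments-by-coders table of category codes (hypothetical data:
    # 0 = "no gesture", 1 = "gesture").
    import numpy as np
    from statsmodels.stats.inter_rater import fleiss_kappa, aggregate_raters

    # Rows = annotated segments, columns = the eight coders.
    ratings = np.array([
        [1, 1, 1, 0, 1, 1, 1, 1],
        [0, 0, 0, 0, 0, 0, 0, 0],
        [1, 0, 1, 1, 0, 1, 1, 0],
    ])

    # aggregate_raters turns the raw codes into a segments-by-categories count table.
    counts, categories = aggregate_raters(ratings)
    print("Fleiss' kappa:", fleiss_kappa(counts))

But of course we would prefer to do this directly within ELAN if the tool supports it.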
All the best,
Elisabeth
