[Corpora-List] kappas

Norton Roman nortontr at gmail.com
Wed Oct 1 12:10:22 CEST 2008


There's also Krippendorff's alpha, which corrects for some biases in Cohen's kappa (and can also be used with multiple coders and missing data):

http://www.asc.upenn.edu/usr/krippendorff/dogs.html
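In case it helps, the nominal-scale version of alpha is short enough to script directly. Below is a minimal Python sketch using the coincidence-matrix formulation; the function name and the input layout (one list of values per coder, with None marking a missing judgement) are my own choices, not from any particular package:

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(data):
    """Krippendorff's alpha for nominal data.

    data: list of coder rows; each row holds one value per unit,
    with None marking a missing judgement.
    """
    n_units = len(data[0])
    coincidences = Counter()  # ordered pairs of values co-occurring in a unit
    for u in range(n_units):
        values = [row[u] for row in data if row[u] is not None]
        m = len(values)
        if m < 2:
            continue  # units with fewer than two values are not pairable
        for a, b in permutations(range(m), 2):
            coincidences[(values[a], values[b])] += 1.0 / (m - 1)
    n = sum(coincidences.values())  # total number of pairable values
    totals = Counter()
    for (c, _), count in coincidences.items():
        totals[c] += count
    observed = sum(v for (c, k), v in coincidences.items() if c != k)
    expected = sum(totals[c] * totals[k]
                   for c in totals for k in totals if c != k)
    return 1.0 - (n - 1) * observed / expected
```

Units coded by only one coder simply drop out, which is how the statistic tolerates missing data.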

cheers

2008/10/1 Francis Tyers <ftyers at prompsit.com>


> On Wed, 01-10-2008 at 11:35 +0200, Steen, G.J. wrote:
> > Dear all,
> >
> > one way to measure inter-coder reliability is Cohen's kappa. But this
> > can only be applied to pairs of raters, at least in the standard use.
> >
> > One solution to the problem of having more than two coders is to
> > average Cohen's kappas across all possible pairs of raters, but I am
> > not sure how this is looked upon in the testing community.
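For what it's worth, the averaging approach is easy to script yourself; here is a minimal Python sketch (the function names and the equal-length label-list input format are my own, not from any package):

```python
from collections import Counter
from itertools import combinations

def cohen_kappa(r1, r2):
    """Cohen's kappa for two raters' label sequences of equal length."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    m1, m2 = Counter(r1), Counter(r2)
    # chance agreement from the two raters' marginal label distributions
    expected = sum(m1[c] * m2[c] for c in m1) / (n * n)
    return (observed - expected) / (1 - expected)

def mean_pairwise_kappa(ratings):
    """Average Cohen's kappa over all pairs of raters."""
    kappas = [cohen_kappa(a, b) for a, b in combinations(ratings, 2)]
    return sum(kappas) / len(kappas)
```

Note that the mean of pairwise kappas and Fleiss' kappa are different statistics (Fleiss uses a pooled chance-agreement term), so they will generally not coincide on the same data.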
> >
> > Another solution to this problem appears to be Fleiss' kappa, which
> > can accommodate more raters in one reliability analysis. What sort of
> > experience do you have with this statistic? And are there any software
> > packages that include it (since SPSS does not seem to have it)?
> >
> > Any advice will be greatly appreciated.
>
> There are Java and Python implementations of Fleiss' kappa on Wikibooks:
>
> http://en.wikibooks.org/wiki/Algorithm_implementation/Statistics/Fleiss%27_kappa
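In case those links rot, the computation itself is only a few lines. Here is a sketch in Python (function name is my own; the input is the usual N-subjects x k-categories count matrix, and it reproduces the worked example on the Fleiss' kappa Wikipedia article, kappa ≈ 0.21):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa.

    counts: N x k matrix; counts[i][j] = number of raters who assigned
    subject i to category j.  Every row must sum to the same number of
    raters n.
    """
    N = len(counts)
    k = len(counts[0])
    n = sum(counts[0])            # ratings per subject
    total = N * n                 # total number of ratings
    # overall proportion of ratings in each category
    p = [sum(row[j] for row in counts) / total for j in range(k)]
    # per-subject observed agreement
    P = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
    P_bar = sum(P) / N            # mean observed agreement
    P_e = sum(pj * pj for pj in p)  # chance agreement
    return (P_bar - P_e) / (1 - P_e)
```

Note that it requires a complete design (every subject rated by the same number of raters), which is one reason Krippendorff's alpha is sometimes preferred.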
>
> There are some other statistics outlined on Wikipedia (which I suppose
> you've already seen):
>
> http://en.wikipedia.org/wiki/Interrater_reliability
>
> The choice of statistic largely depends on the experiment.
>
> Fran
>
>
> _______________________________________________
> Corpora mailing list
> Corpora at uib.no
> http://mailman.uib.no/listinfo/corpora
>


