
EasyDIAg: A tool for easy determination of interrater agreement

Holle, Henning; Rein, Robert

Authors

Dr Henning Holle H.Holle@hull.ac.uk
Reader in Psychology / Leader of Cognitive and Clinical Neuroscience group ( www.hull.ac.uk/neuroscience )

Robert Rein



Abstract

Reliable measurements are fundamental for the empirical sciences. In observational research, measurements often consist of observers categorizing behavior into nominal-scaled units. Since categorization is the outcome of a complex judgment process, it is important to evaluate the extent to which these judgments are reproducible, by having multiple observers independently rate the same behavior. A challenge in determining interrater agreement for timed-event sequential data is to develop clear, objective criteria for deciding whether two raters' judgments relate to the same event (the linking problem). Furthermore, many studies presently report only raw agreement indices, without considering the degree to which agreement can occur by chance alone. Here, we present a novel, free, and open-source toolbox (EasyDIAg) designed to assist researchers with the linking problem, while also providing chance-corrected estimates of interrater agreement. Additional tools are included to facilitate the development of coding schemes and rater training.
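To illustrate the chance correction the abstract refers to, the following is a minimal sketch of Cohen's kappa for two raters' already-linked nominal codes. It is not taken from the EasyDIAg toolbox (which additionally solves the linking problem for timed-event data); the function name and example categories are illustrative.

```python
# Minimal sketch of chance-corrected agreement (Cohen's kappa) for two raters'
# aligned sequences of nominal codes. Linking of timed events is assumed done.
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Chance-corrected agreement between two aligned sequences of nominal codes."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    # Observed agreement: proportion of events coded identically.
    p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected chance agreement from each rater's marginal category frequencies.
    freq1, freq2 = Counter(rater1), Counter(rater2)
    p_chance = sum(freq1[c] * freq2.get(c, 0) for c in freq1) / (n * n)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical annotations: 6 linked events, 5 coded identically.
r1 = ["gesture", "speech", "gesture", "rest", "speech", "gesture"]
r2 = ["gesture", "speech", "rest",    "rest", "speech", "gesture"]
print(round(cohens_kappa(r1, r2), 3))  # 0.75, well below the raw agreement of 0.833
```

The gap between raw agreement (5/6 ≈ 0.83) and kappa (0.75) is exactly the point made in the abstract: part of the raw agreement is expected by chance given each rater's category frequencies.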

Publication Date Sep 24, 2015
Journal Behavior Research Methods
Print ISSN 1554-351X
Electronic ISSN 1554-3528
Publisher Springer-Verlag
Peer Reviewed Peer Reviewed
Volume 47
Issue 3
Pages 837-847
APA6 Citation Holle, H., & Rein, R. (2015). EasyDIAg: A tool for easy determination of interrater agreement. Behavior Research Methods, 47(3), 837-847. https://doi.org/10.3758/s13428-014-0506-7
DOI https://doi.org/10.3758/s13428-014-0506-7
Keywords Cohen’s kappa, Toolbox, Coding, Annotation, Rater, Agreement
Publisher URL http://link.springer.com/article/10.3758/s13428-014-0506-7
Copyright Statement ©2014 University of Hull
Additional Information This is a description of an article published in Behavior Research Methods. The final publication is available at Springer via http://link.springer.co....3758/s13428-014-0506-7
