
EasyDIAg: A tool for easy determination of interrater agreement

Holle, Henning; Rein, Robert

Authors

Dr Henning Holle H.Holle@hull.ac.uk
Reader in Psychology / Leader of Cognitive and Clinical Neuroscience group (https://www.hull.ac.uk/neuroscience)

Robert Rein



Abstract

Reliable measurements are fundamental for the empirical sciences. In observational research, measurements often consist of observers categorizing behavior into nominal-scaled units. Since the categorization is the outcome of a complex judgment process, it is important to evaluate the extent to which these judgments are reproducible, by having multiple observers independently rate the same behavior. A challenge in determining interrater agreement for timed-event sequential data is to develop clear, objective criteria for deciding whether two raters' judgments relate to the same event (the linking problem). Furthermore, many studies presently report only raw agreement indices, without considering the degree to which agreement can occur by chance alone. Here, we present a novel, free, and open-source toolbox (EasyDIAg) designed to assist researchers with the linking problem, while also providing chance-corrected estimates of interrater agreement. Additional tools are included to facilitate the development of coding schemes and rater training.
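
A short sketch may help make the abstract's two technical points concrete: first linking the two raters' timed events (the linking problem), then correcting raw agreement for chance (Cohen's kappa). The Python fragment below is an illustration only, not the EasyDIAg toolbox itself; the 60% relative-overlap threshold, the greedy matching, and the helper names are simplifying assumptions made for this sketch.

from collections import Counter

def overlap_ratio(ev1, ev2):
    # Temporal overlap of two (onset, offset) events, relative to the
    # duration of the shorter event.
    onset1, offset1 = ev1
    onset2, offset2 = ev2
    overlap = max(0.0, min(offset1, offset2) - max(onset1, onset2))
    shorter = min(offset1 - onset1, offset2 - onset2)
    return overlap / shorter if shorter > 0 else 0.0

def link_events(events_a, events_b, threshold=0.6):
    # Greedily pair events whose relative overlap exceeds the threshold.
    # events_* are lists of ((onset, offset), label) tuples.
    pairs, used = [], set()
    for span_a, label_a in events_a:
        for j, (span_b, label_b) in enumerate(events_b):
            if j not in used and overlap_ratio(span_a, span_b) >= threshold:
                pairs.append((label_a, label_b))
                used.add(j)
                break
    return pairs

def cohens_kappa(pairs):
    # Chance-corrected agreement over linked label pairs:
    # kappa = (p_observed - p_expected) / (1 - p_expected).
    n = len(pairs)
    p_obs = sum(a == b for a, b in pairs) / n
    freq_a = Counter(a for a, _ in pairs)
    freq_b = Counter(b for _, b in pairs)
    p_exp = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Toy example: two raters annotating the same recording (times in seconds).
rater1 = [((0.0, 1.0), "gesture"), ((1.5, 2.5), "speech"), ((3.0, 4.0), "rest")]
rater2 = [((0.1, 1.1), "gesture"), ((1.6, 2.4), "speech"), ((3.1, 4.2), "gesture")]
print(f"kappa = {cohens_kappa(link_events(rater1, rater2)):.2f}")  # kappa = 0.50

A fuller treatment would also account for unlinked events (behavior coded by one rater but missed by the other), which chance-corrected agreement on timed-event data has to handle; the sketch above ignores them for brevity.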

Citation

Holle, H., & Rein, R. (2015). EasyDIAg: A tool for easy determination of interrater agreement. Behavior Research Methods, 47(3), 837-847. https://doi.org/10.3758/s13428-014-0506-7

Acceptance Date Jan 1, 2014
Online Publication Date Aug 9, 2014
Publication Date Sep 24, 2015
Deposit Date Apr 16, 2015
Publicly Available Date Apr 16, 2015
Journal Behavior Research Methods
Print ISSN 1554-351X
Publisher Springer Verlag
Peer Reviewed Yes
Volume 47
Issue 3
Pages 837-847
DOI https://doi.org/10.3758/s13428-014-0506-7
Keywords Cohen’s kappa, Toolbox, Coding, Annotation, Rater, Agreement
Public URL https://hull-repository.worktribe.com/output/372612
Publisher URL http://link.springer.com/article/10.3758/s13428-014-0506-7
Additional Information This is a description of an article which has been published in: Behavior research methods. The final publication is available at Springer via http://link.springer.com/article/10.3758/s13428-014-0506-7
Contract Date Apr 16, 2015
