EasyDIAg: A tool for easy determination of interrater agreement
Holle, Henning; Rein, Robert
Authors
Dr Henning Holle H.Holle@hull.ac.uk
Reader in Psychology / Leader of Cognitive and Clinical Neuroscience group (https://www.hull.ac.uk/neuroscience)
Robert Rein
Abstract
Reliable measurements are fundamental for the empirical sciences. In observational research, measurements often consist of observers categorizing behavior into nominal-scaled units. Since the categorization is the outcome of a complex judgment process, it is important to evaluate the extent to which these judgments are reproducible, by having multiple observers independently rate the same behavior. A challenge in determining interrater agreement for timed-event sequential data is to develop clear objective criteria to determine whether two raters’ judgments relate to the same event (the linking problem). Furthermore, many studies presently report only raw agreement indices, without considering the degree to which agreement can occur by chance alone. Here, we present a novel, free, and open-source toolbox (EasyDIAg) designed to assist researchers with the linking problem, while also providing chance-corrected estimates of interrater agreement. Additional tools are included to facilitate the development of coding schemes and rater training.
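The chance-corrected agreement the abstract refers to is typically computed as Cohen's kappa: the raw proportion of agreement is adjusted by the proportion expected if both raters coded independently at their observed marginal rates. The sketch below is purely illustrative (it is not the EasyDIAg implementation, and the confusion matrix is invented for the example); it shows the standard kappa calculation from a square confusion matrix of two raters' category assignments.

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix (rows: rater 1, cols: rater 2)."""
    m = np.asarray(confusion, dtype=float)
    n = m.sum()
    # Observed agreement: proportion of events both raters coded identically.
    p_observed = np.trace(m) / n
    # Chance agreement: inner product of the two raters' marginal proportions.
    p_chance = (m.sum(axis=0) * m.sum(axis=1)).sum() / n**2
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical example: two raters coding 100 events into three categories.
conf = [[40, 5, 0],
        [4, 30, 3],
        [1, 2, 15]]
print(round(cohens_kappa(conf), 3))  # raw agreement is 0.85; kappa is lower
```

Note how the raw agreement (85 of 100 events on the diagonal) overstates reliability: after subtracting the agreement expected by chance alone, kappa comes out substantially lower, which is exactly the correction the abstract argues for.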
Citation
Holle, H., & Rein, R. (2015). EasyDIAg: A tool for easy determination of interrater agreement. Behavior Research Methods, 47(3), 837-847. https://doi.org/10.3758/s13428-014-0506-7
| Acceptance Date | Jan 1, 2014 |
| --- | --- |
| Online Publication Date | Aug 9, 2014 |
| Publication Date | Sep 24, 2015 |
| Deposit Date | Apr 16, 2015 |
| Publicly Available Date | Apr 16, 2015 |
| Journal | Behavior Research Methods |
| Print ISSN | 1554-351X |
| Publisher | Springer Verlag |
| Peer Reviewed | Peer Reviewed |
| Volume | 47 |
| Issue | 3 |
| Pages | 837-847 |
| DOI | https://doi.org/10.3758/s13428-014-0506-7 |
| Keywords | Cohen’s kappa, Toolbox, Coding, Annotation, Rater, Agreement |
| Public URL | https://hull-repository.worktribe.com/output/372612 |
| Publisher URL | http://link.springer.com/article/10.3758/s13428-014-0506-7 |
| Additional Information | This is a description of an article which has been published in Behavior Research Methods. The final publication is available at Springer via http://link.springer.com/article/10.3758/s13428-014-0506-7 |
| Contract Date | Apr 16, 2015 |
Files
10726 Holle.pdf (652 KB, PDF)
Copyright Statement
©2014 University of Hull