Chapter
Measures of interrater agreement when each target is evaluated by a different group of raters

    Author(s)
    BOVE, Giuseppe
    Language
    English
    Abstract
Most measures of interrater agreement are defined for ratings of a group of targets in which each target is rated by the same group of raters (e.g., the agreement of raters who assess on a rating scale the language proficiency of a corpus of argumentative written texts). However, there are situations in which agreement concerns a group of targets where each target is evaluated by a different group of raters, for instance when the teachers in a school are evaluated through a questionnaire administered to all the pupils in their classroom. In these situations, a first approach is to evaluate the level of agreement for the whole group of targets with the ANOVA one-way random model. A second approach is to apply subject-specific indices of interrater agreement such as rWG, which compares the observed variance of the ratings with the variance of a theoretical distribution representing no agreement (i.e., the null distribution). Neither of these approaches is appropriate for ordinal or nominal scales. In this paper, an index is proposed to evaluate the agreement between raters for each single target (subject or object) on an ordinal scale, and also to obtain a global measure of interrater agreement for the whole group of cases evaluated. The index is not affected by a possible concentration of the ratings on a very small number of scale levels, as happens for measures based on the ANOVA approach, and it does not depend on the definition of a null distribution, as rWG does. The main features of the proposal are illustrated in a study on the assessment of teaching behavior in the classroom, based on data collected in 2018 in a research project conducted at Roma Tre University.
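As a point of reference for the rWG index discussed in the abstract, the following is a minimal sketch of the standard single-target computation: the observed variance of one target's ratings is compared with the variance of a uniform null distribution over the A points of the scale, (A² − 1)/12. The function name and example data are illustrative assumptions, not taken from the chapter.

```python
def r_wg(ratings, scale_points):
    """Single-target rWG: 1 - observed variance / null (uniform) variance.

    ratings: the ratings one target received from its own group of raters.
    scale_points: number of levels A of the rating scale; the no-agreement
    (uniform) null distribution over A discrete levels has variance (A^2 - 1)/12.
    """
    n = len(ratings)
    mean = sum(ratings) / n
    s2 = sum((x - mean) ** 2 for x in ratings) / (n - 1)  # sample variance
    sigma2_null = (scale_points ** 2 - 1) / 12            # uniform null variance
    return 1 - s2 / sigma2_null

# Each target (e.g., a teacher) is rated by a different group of raters
# (e.g., the pupils of one classroom), so rWG is computed per target:
print(r_wg([4, 4, 5, 4, 5], scale_points=5))  # ratings cluster -> high agreement
print(r_wg([1, 3, 5, 2, 4], scale_points=5))  # ratings spread -> low agreement
```

This also makes the abstract's criticism concrete: the index hinges entirely on the chosen null distribution, and it treats the scale as interval rather than ordinal, which is what motivates the alternative index proposed in the chapter.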
    Book
    ASA 2022 Data-Driven Decision Making
    URI
    https://library.oapen.org/handle/20.500.12657/74900
    Keywords
    Interrater agreement; Ordinal data; Teacher evaluation
    DOI
    10.36253/979-12-215-0106-3.28
    ISBN
9791221501063
    Publisher
    Firenze University Press, Genova University Press
    Publication date and place
    Florence, 2023
    Series
    Proceedings e report, 134
    Classification
    Society and Social Sciences
    Pages
    6
    Rights
    https://creativecommons.org/licenses/by/4.0/