Deakin University
File(s) under permanent embargo

Does the use of summative peer assessment in collaborative group work inhibit good judgement?

journal contribution
posted on 2019-05-01, 00:00 authored by Bhavani Sridharan, Joanna Tai, David Boud
The accuracy and consistency of peer marking, particularly when students have the power to reward (or penalise) peers during formative and summative assessment regimes, are largely unknown. The objective of this study is to evaluate students’ ability and behaviour in marking their peers’ teamwork performance in a collaborative group assessment context, both when the mark counts towards their final grade and when it does not. Formative and summative assessment data were obtained from 98 participants in anonymous self and peer assessment of team members’ contributions to a group assessment in business courses. The findings indicate that students are largely capable of judging their peers’ performance accurately and consistently, especially in the formative evaluation of the process component of group work. However, the findings also suggest significant peer grading bias when peer marks contribute to final grades. Overall, students appear reluctant to assess their peers honestly when they realise that their actions can penalise non-contributing students. This raises questions about the appropriateness of using peer marks for summative assessment purposes. To overcome the problems identified, this paper proposes a number of measures to guide educators in effectively embedding summative peer assessment in a group assessment context.

History

Journal

Higher Education

Volume

77

Issue

5

Pagination

853-870

Publisher

Springer

Location

Cham, Switzerland

ISSN

0018-1560

eISSN

1573-174X

Language

eng

Publication classification

C1 Refereed article in a scholarly journal

Copyright notice

2018, Springer Nature B.V.
