By Steve Bentley, Learning Technologist at University of Huddersfield
Originally published 6 October 2020
Group and collaborative assessment activities have never been more important, given their links to employability and the shift towards more authentic assessment. However, they also raise a question of fairness: is it fair for all members of a group to receive the same grade? Students do not always think so, and some scholars have gone so far as to describe awarding the same grade to every student in a group as “unethical”.
As the equity of group assessment is debated, there has been a move towards peer assessment (also known as peer evaluation), in which students provide feedback and/or a mark on the other members of their group, which in turn contributes to a final mark.
With this move towards peer assessment, the University of Huddersfield developed a flexible peer assessment facility that is built on existing Brightspace functionality (Content, Groups, Assignments and Grades), removing the need for integrations and the data protection governance associated with third-party products.
This tool allows each student to allocate a pool of points across their group members for any number of assessment criteria: 100 × n points per criterion, where n is the number of students being rated. Depending on the instructor's preference when creating the assessment, this process can include a self-evaluation, and students can also be allowed to provide comments that justify the marks given to their peers.
Example: for a group of four with self-assessment turned on, each student has 400 points to assign across the four group members for each assessment criterion. Without self-assessment, the pool is 300.
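The points-pool scheme above can be sketched in a few lines of Python. This is an illustrative model only, not the tool's actual code; the function and member names are invented for the example.

```python
def points_pool(group_size: int, self_assessment: bool) -> int:
    """Total points each student distributes per criterion:
    100 points per member being rated (hypothetical model of the
    scheme described in the post)."""
    recipients = group_size if self_assessment else group_size - 1
    return 100 * recipients


def average_marks(allocations: list[dict[str, int]]) -> dict[str, float]:
    """Average the points each member received across all raters."""
    totals: dict[str, float] = {}
    counts: dict[str, int] = {}
    for allocation in allocations:
        for member, points in allocation.items():
            totals[member] = totals.get(member, 0) + points
            counts[member] = counts.get(member, 0) + 1
    return {member: totals[member] / counts[member] for member in totals}


# A group of four with self-assessment off: each rater splits
# 300 points across the other three members.
print(points_pool(4, self_assessment=False))  # 300

# Two raters' allocations for (some of) their peers:
marks = average_marks([
    {"Beth": 110, "Carl": 100, "Dina": 90},   # Ana's allocation
    {"Ana": 120, "Carl": 90, "Dina": 90},     # Beth's allocation
])
print(marks["Carl"])  # 95.0
```

Because each rater's pool averages out to 100 points per recipient, a member who contributed an equal share tends to receive a mark close to 100.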
The instructor can review the marks in a screen that averages the students' marks and allows them to be published to a grade item, along with anonymised versions of the comments. That peer-sourced mark can then be combined with the instructor's mark using the usual Grades tool features (such as a formula grade item).
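As one illustration of how a formula grade item might combine the two marks, the instructor's group mark can be scaled by the peer-sourced factor. The formula below is an assumption for the sake of example, not the one the tool prescribes; the actual combination is whatever the instructor configures in Grades.

```python
def individual_mark(group_mark: float, peer_average: float,
                    cap: float = 100.0) -> float:
    """One possible combination (hypothetical): scale the instructor's
    group mark by the peer-sourced weighting factor (peer_average / 100),
    capped at full marks so strong peer ratings cannot exceed 100%."""
    return min(group_mark * peer_average / 100.0, cap)


# A group mark of 70% and a peer average of 110 points
# yields an individual mark of 77%.
print(individual_mark(70, 110))  # 77.0
```

A peer average of exactly 100 leaves the group mark unchanged, which is why the 100-points-per-member pool is a convenient baseline for this style of formula.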
Figure: Example of the Peer Assessment tool.
The result is a streamlined workflow which is convenient for both instructor and student, meaning that running multiple rounds of peer assessment - the recommended best practice - becomes viable.
The attached zip file contains the script and admin installation instructions as well as a guide for instructors.
Feedback and Support
Although we hope to make additional improvements and updates to the Peer Assessment Tool in the future, at this point D2L is not committed to supporting or maintaining it. Nevertheless, we welcome your feedback and encourage you to share tips, tricks, extensions, etc., with your fellow Community members.