Paper presented at the 2010 ASEE Annual Conference & Exposition, Louisville, Kentucky, June 20-23, 2010
Session: First-Year Programs
Pages: 15.1136.1 - 15.1136.9
ISSN: 2153-5965
DOI: 10.18260/1-2--16755
URL: https://peer.asee.org/16755
Students’ Peer Evaluation Calibration Through the Administration of Vignettes
Abstract
Peer evaluation has been widely used for measuring student performance in collaborative teamwork. However, students tend to be biased when rating their peers; the halo effect, the central tendency effect, and the leniency effect are the most common biases in peer evaluation. One technique for reducing such bias is to calibrate student peer evaluations with vignettes. A vignette describes a hypothetical team member whose specific attributes demonstrate the characteristics the peer evaluation is meant to assess. By having students evaluate this hypothetical person and comparing their ratings with those of trained experts, we expect to measure students' biases and to provide a training opportunity that improves students' rating skills and reduces rating bias.
The theoretical framework in our study operationally defines team effectiveness through three constructs: interdependency, goal setting, and potency. A vignette designed to illustrate attributes of interdependency, goal setting, and potency was administered at different points in the semester. Participants in the study were enrolled in a first-year engineering course and assigned to work on real engineering-related projects in authentic teams of three or four, authentic meaning that the students worked together in a team toward course-related projects.
Student ratings were compared with expert ratings, with the experts' ratings treated as unbiased. The difference between a student's rating and the expert rating was defined as that student's bias. Students' rating biases were analyzed at both the item level and the construct level. At both levels, our data show that students did not perform better with repeated vignette administrations. However, after students' measured biases were used to calibrate their ratings, their peer evaluation scores moved closer to the experts' scores.
Introduction
The Accreditation Board for Engineering and Technology (ABET) [1] Engineering Criteria 2000 requires that engineering students "be able to function effectively in a multidisciplinary team". The question is: how can students' team skills be taught and assessed [2,3,4]? In a previous study, we defined students' team skills through a three-construct theoretical model: interdependency, goal setting, and potency. This model has potential applications in both pedagogy and assessment. Peer evaluation has been used as an effective instrument to assess students' team skills and performance [5,6,7,8,9]. We developed a 9-item peer evaluation questionnaire to measure students' individual perceptions of their teammates along our three-construct theoretical model [10,11].
When conducting peer evaluations, students tend to construct their own social situations, leading to different rater biases. Three biases are most common in the peer evaluation process: the halo effect, the central tendency effect, and the leniency effect [12,13,14]. The halo effect occurs when a rater does not differentiate between subscales. The central tendency effect occurs when a rater does not make use of the full range of the rating scale. A rater might consistently give higher or lower ratings than warranted; this is the leniency effect.
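To make the bias and calibration computations concrete, the following is a minimal sketch, not the authors' actual instrument, of how item-level and construct-level biases and the three rater effects might be quantified from a student's vignette ratings against an expert key. The 5-point scale, the grouping of the nine items into the three constructs, and all function and variable names are illustrative assumptions.

# Illustrative sketch (hypothetical names and item groupings, not the
# authors' instrument): quantifying rater bias from vignette ratings,
# with the expert key treated as the unbiased reference.
import statistics

# Assumed grouping of the 9 questionnaire items into the three constructs.
CONSTRUCTS = {
    "interdependency": [0, 1, 2],
    "goal_setting":    [3, 4, 5],
    "potency":         [6, 7, 8],
}

def item_bias(student, expert):
    """Item-level bias: student rating minus expert rating, per item."""
    return [s - e for s, e in zip(student, expert)]

def construct_bias(student, expert):
    """Construct-level bias: mean item bias within each construct."""
    diffs = item_bias(student, expert)
    return {name: statistics.mean(diffs[i] for i in items)
            for name, items in CONSTRUCTS.items()}

def leniency(student, expert):
    """Mean bias across all items; positive values suggest leniency."""
    return statistics.mean(item_bias(student, expert))

def halo_range(student):
    """Spread across items; a small range suggests the rater is not
    differentiating between subscales (halo effect)."""
    return max(student) - min(student)

def central_tendency_spread(student):
    """Standard deviation of the ratings; a small value suggests the
    rater is not using the full range of the scale (central tendency)."""
    return statistics.pstdev(student)

def calibrate(peer_ratings, bias):
    """Adjust raw peer ratings by subtracting the rater's measured bias."""
    return [r - b for r, b in zip(peer_ratings, bias)]

# Example: one student's vignette ratings against a hypothetical expert
# key on a 5-point scale; this rater is undifferentiated and lenient.
expert_key = [4, 2, 5, 3, 4, 2, 5, 3, 4]
student    = [4, 4, 4, 4, 4, 4, 4, 4, 4]

print(construct_bias(student, expert_key))
print("leniency:", leniency(student, expert_key))
print("halo range:", halo_range(student),
      "spread:", central_tendency_spread(student))

Under these assumptions, the student's per-item biases from the vignette could then be passed to calibrate() to adjust that student's subsequent ratings of real teammates, which corresponds to the calibration step described in the abstract.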
Wang, J., & Imbrie, P. (2010, June), Students’ Peer Evaluation Calibration Through The Administration Of Vignettes Paper presented at 2010 Annual Conference & Exposition, Louisville, Kentucky. 10.18260/1-2--16755