June 22, 2008
June 25, 2008
13.315.1 - 13.315.19
Comparison of Two Peer Evaluation Instruments for Project Teams
The College of Engineering at the University of Notre Dame has used a paper-pencil instrument for peer evaluations since 2005 as part of the assessment of project team efforts (typically 4-5 students per team) in its First Year Engineering Course. The College was considering moving from paper-pencil peer evaluations to an on-line, behaviorally based evaluation instrument, CATME1. The instructors at Notre Dame conducted a comparative study of student feedback on these two instruments during the fall 2007 semester. The students (~380) in the first year course were divided into two groups of approximately equal size, one group using the paper-pencil instrument and the other using CATME. After completing peer evaluations for a seven-week course project, the students were required to complete a survey reporting their reactions to the instrument they used in terms of perceived simplicity, comfort, confidentiality, usefulness of feedback, and overall experience. Comparison of the survey results provided insight into the relative merits and drawbacks of the two instruments. Several of the follow-up survey questions comparing the instruments did not show statistically significant differences in the sample means. In spite of the confounding of instrument design and administration method, useful results emerged. The largest differences in student survey results were seen in the areas of feedback and overall experience, both of which were rated higher for CATME. Student confidence that instructors would keep their comments confidential was high for both instruments, though slightly higher for the paper-pencil instrument. Because student perception of the quality of the feedback is critical to both rater accuracy and the student learning experience, this study enabled the College to make a data-driven decision to use the CATME instrument in future offerings of the first year course.
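The abstract's comparison of sample means between the two survey groups can be illustrated with a two-sample test. The sketch below implements Welch's t statistic (which does not assume equal group variances) in pure Python; the survey data shown are hypothetical 5-point Likert responses invented for illustration, not the study's actual data, and the paper does not specify which significance test the authors used.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and its approximate degrees of freedom.

    Suitable when the two groups may have unequal variances and sizes,
    as with the two roughly equal survey groups described in the abstract.
    """
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)        # sample variances
    se2 = va / na + vb / nb                  # squared standard error of the mean difference
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical Likert-scale responses (1-5) for one survey question.
paper = [4, 3, 4, 5, 3, 4, 4, 3, 5, 4]   # paper-pencil group (invented)
catme = [5, 4, 4, 5, 4, 5, 3, 5, 4, 4]   # CATME group (invented)

t, df = welch_t(paper, catme)
# With large df, |t| below roughly 2.0 would not be significant at alpha = 0.05,
# matching the abstract's finding of no significant difference on several questions.
```

The statistic alone does not give a p-value (the t-distribution CDF is not in the standard library); in practice a library such as SciPy's `ttest_ind` with `equal_var=False` would report one directly.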
College students, regardless of their field of study, commonly work collaboratively in groups on course assignments. The benefits of collaborative learning have been well documented2,3,4 and are rarely disputed. However, collaboration can complicate the evaluation of individual students' work. For example, how can instructors ensure that all students contribute appropriately toward the completion of a project? There are often concerns over hitchhiking, a phenomenon wherein a student does not contribute adequately toward the project goals and allows teammates to do the majority of the project work. A disconnect arises because the instructor is typically not present for much of the time the group spends working on a project outside of class, yet must still assign individual course grades. Social dominance is another potential issue, wherein one student takes over a project and does not allow other group members to contribute to project goals in a meaningful way. Given these challenges, finding an effective method to assess individual contributions to group work and assign grades accordingly is a topic of much research and debate within the education community, with substantial attention paid to the benefits, and possible limitations, of peer evaluation methods5,6,7.
Meyers, K., & Silliman, S., & Ohland, M., & McWilliams, L., & Kijewski-Correa, T. (2008, June), Comparison Of Two Peer Evaluation Instruments For Project Teams. Paper presented at 2008 Annual Conference & Exposition, Pittsburgh, Pennsylvania. https://peer.asee.org/3437
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2008 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015