Session #2613
Student Development of Grading and Assessment Criteria

Valerie L. Young
Department of Chemical Engineering, Ohio University
Abstract
Faculty at Ohio University increasingly use rubrics to simultaneously grade student work and assess student learning. One tenet of this “Criterion-Based Grading” system is that the basis for grading is known to the students, allowing them to evaluate their own work before submitting it. Beyond making such self-evaluation possible, we wish to actively encourage it. We consider the ability to evaluate one’s own work to be an essential skill and habit of a practicing engineer. However, we have learned by experience that even when students are provided with the rubric, they seldom evaluate their own work effectively. In an effort to counter this, students in a sophomore-level “Energy Balances” course are asked to help develop a rubric that will be used to grade and assess a team project in the course. The mechanism for including student input in rubric development, and an assessment of the students’ ability to use the resulting rubrics for self-evaluation, are discussed.
Introduction
The chemical engineering curriculum at Ohio University requires students to complete open-ended assignments in a team environment at the sophomore, junior, and senior levels. The deliverable in these assignments is typically a report, either oral or written. Grading and assessment of this work is complex, involving both the quality of the technical content and the quality of the presentation. This type of work also provides a high density of assessment information because of its complexity. Rubrics provide a framework for structuring and quantifying this assessment information. Rubrics, if made available to the students, should also give students a rationale for the grades they receive and an opportunity to evaluate and improve their own work prior to submission. (See, for example, Walvoord & Anderson, 1998¹.)
Grading in our senior Unit Operations Laboratory is now entirely rubric-based.² We have been pleased with the rubrics as a foundation for assessment in this course. In some respects, the rubrics have also resulted in notable improvement in the reports. For example, reports are now more concise, focusing on the important traits of the reports as defined by the grading rubrics. On the other hand, students struggle to effectively use the descriptions in the rubrics to assess their own performance. For example, one of the traits graded for a prelab report is the proposed statistical analysis of the results, and the description of an “A” performance begins with the phrase, “Uncertainties for all values stated.” The description of an “F” performance begins with “Uncertainties not given for most values.” Still, students submit reports in which the vocabulary of statistics features prominently, yet the method for quantifying the uncertainty on important values is never stated.
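As background for readers unfamiliar with this rubric language (offered here as context, not as text drawn from the rubric itself), “stating an uncertainty” for a calculated result means propagating the measurement uncertainties through the calculation. For a result y = f(x_1, …, x_n) computed from independent measured values x_i, each with standard uncertainty u_{x_i}, the standard first-order propagation formula is

% First-order (Taylor-series) propagation of uncertainty
% for y = f(x_1, ..., x_n) with independent inputs x_i:
\[
  u_y = \sqrt{\sum_{i=1}^{n} \left( \frac{\partial f}{\partial x_i} \right)^{2} u_{x_i}^{2}}
\]

For instance, a ratio of two measured flow rates, R = m_1/m_2, carries the relative uncertainty \( u_R/R = \sqrt{(u_{m_1}/m_1)^2 + (u_{m_2}/m_2)^2} \); a prelab report meeting the “A” description would state such an expression, and the resulting numerical uncertainty, for each reported value.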