
Algorithm for Consistent Grading in an Introduction to Engineering Course



2020 ASEE Virtual Annual Conference Content Access


Virtual Online

Publication Date

June 22, 2020

Start Date

June 22, 2020

End Date

June 26, 2020

Conference Session

First-Year Programs: Assessment in the First Year

Tagged Division

First-Year Programs


Paper Authors


Joshua A. Enszer University of Delaware


Joshua Enszer is an associate professor in Chemical and Biomolecular Engineering at the University of Delaware. He has taught core and elective courses across the curriculum, from introduction to engineering science and material and energy balances to process control, capstone design, and mathematical modeling of chemical and environmental systems. His research interests include technology and learning in various incarnations: electronic portfolios as a means for assessment and professional development, implementation of computational tools across the chemical engineering curriculum, and game-based learning.



Jenni M. Buckley University of Delaware


Dr. Buckley is an Associate Professor of Mechanical Engineering at the University of Delaware. She received her Bachelor of Engineering (2001) in Mechanical Engineering from the University of Delaware, and her MS (2004) and PhD (2006) in Mechanical Engineering from the University of California, Berkeley, where she worked on computational and experimental methods in spinal biomechanics. Since 2006, her research efforts have focused on the development and mechanical evaluation of medical and rehabilitation devices, particularly orthopaedic, neurosurgical, and pediatric devices. She teaches courses in design, biomechanics, and mechanics at the University of Delaware and is heavily involved in K-12 engineering education efforts at the local, state, and national levels.




This Complete Evidence-based Practice paper will describe the design and implementation of rubrics in a large-enrollment introduction to engineering course.

Timely and meaningful feedback is important to student learning but challenging to deliver in large enrollment classes. The use of rubrics is virtually mandatory to ensure clear communication of expectations and consistency in evaluation. We have implemented a rubric algorithm to address the time-based challenges of both rubric design and implementation.

Rubrics are used both to clarify expectations for student work in advance and to evaluate submitted student work. The two main elements of a rubric are the criteria and the standards. The criteria (usually the “rows”) of a rubric are the characteristics of work being evaluated, while the standards (usually the “columns”) establish levels of quality. The mechanics of rubric construction are explored in detail by Stevens and Levi. Most of their example rubrics have four to six criteria assessed against three standard levels. They suggest constructing these rubrics by starting with the “outside” columns and working inward: for each criterion, first establish the highest standard level, then the lowest, and then fill in the middle level(s). This style of rubric becomes more cumbersome to construct as the number of standards grows. It has also been suggested that rubrics use an even number of standards to avoid a “middle” option during evaluation.
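As a concrete illustration of this criteria-by-standards structure, a rubric can be represented as a simple two-level mapping. The criterion names, standard labels, and descriptors below are invented for the example, not drawn from Stevens and Levi:

```python
# Hypothetical sketch: a rubric as criteria ("rows") crossed with
# standards ("columns"). All names and descriptors are illustrative.

rubric = {
    "technical accuracy": {
        "excellent": "All calculations correct and clearly justified.",
        "acceptable": "Minor errors that do not affect conclusions.",
        "developing": "Errors that undermine the main result.",
    },
    "communication": {
        "excellent": "Organized, concise, and free of ambiguity.",
        "acceptable": "Understandable with some effort.",
        "developing": "Difficult to follow.",
    },
}

# Every criterion is assessed against the same ordered set of standards,
# from highest quality level to lowest.
standards = ["excellent", "acceptable", "developing"]
```

Each row of the written rubric corresponds to one key of `rubric`, and each column to one entry of `standards`; the outside-in construction order described above means filling in the first and last column of each row before the middle one.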

We have developed the rubrics for our Engineering 101 course by focusing only on the two outermost columns of each rubric, describing only the highest quality level (which earns full credit, an A grade) and the minimum acceptable quality level (which earns credit roughly equivalent to a C or C- grade). The remaining columns are effectively left blank, but with a deliberate algorithm in mind that expands the rubric from two columns to six: two columns lie between A and C-, representing work closer to the A description or closer to the C- description, and two columns lie below the C-, representing an attempt that falls below the minimum standard or no attempt at all.

Rubric use follows the same general algorithm: the student work is first compared against the highest quality level, then if necessary against the lowest acceptable level, and finally, if necessary, the work is judged closer to one of these levels or the other. The final element of this project is the training of our teaching assistants to achieve consistent evaluation of student work across all students in the class. This consists of a calibration exercise before the start of the semester and regular spot-checking by lead teaching assistants during the semester.
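The decision procedure above can be sketched in code. This is our illustrative reading of the algorithm, not an artifact from the paper: the function name, its inputs (the grader's judgments against the two anchor descriptions), and the level labels are all invented for the example.

```python
def grade_criterion(meets_a: bool, meets_c: bool,
                    nearer: str = "", attempted: bool = True) -> str:
    """Map a grader's judgments on one rubric criterion to one of the
    six quality levels of the expanded two-anchor rubric (sketch)."""
    # Step 1: compare the work against the highest quality level (the A column).
    if meets_a:
        return "A"
    # Step 2: if not an A, compare against the minimum acceptable level (C-).
    if meets_c:
        if not nearer:
            return "C-"  # matches the C- description itself
        # Step 3: the work falls between the two anchors; the grader
        # decides which description it is nearer.
        return "closer to A" if nearer == "A" else "closer to C-"
    # Below the minimum standard: distinguish an attempt from no attempt.
    return "attempt below minimum" if attempted else "no attempt"
```

For example, work that falls short of the A description but exceeds the C- anchor and reads nearer the A column would be scored `grade_criterion(False, True, nearer="A")`, i.e. "closer to A". Only two cells per criterion ever need to be written; the other four levels follow mechanically from the comparisons.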

In the full paper we will describe our rubric development and implementation process with examples taken directly from our introductory engineering course (ca. 750 students across two sections, with 15 teaching assistants per section). We will present qualitative and quantitative evidence that the use of rubrics per our methodology results in high grading consistency and timely grade turnaround while remaining relatively user-friendly for teaching assistants to implement. Quantitative evidence will include a comparison of inter-rater reliability for course assignments pre- and post-implementation of the streamlined rubric algorithm. We will also present feedback from teaching assistants on the ease of use of the new algorithm.

Enszer, J. A., & Buckley, J. M. (2020, June), Algorithm for Consistent Grading in an Introduction to Engineering Course. Paper presented at 2020 ASEE Virtual Annual Conference Content Access, Virtual Online. 10.18260/1-2--34100

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2020 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015