
User Testing with Assessors to Develop Universal Rubric Rows for Assessing Engineering Design

2016 ASEE Annual Conference & Exposition


New Orleans, Louisiana

Publication Date

June 26, 2016


Conference Session

Student Evaluation in Design Education

Tagged Division

Design in Engineering Education


Paper Authors


Nikita Dawe University of Toronto


Nikita is an M.A.Sc. candidate in the Department of Mechanical and Industrial Engineering at the University of Toronto. She is completing the Collaborative Program in Engineering Education.



Lisa Romkey University of Toronto


Lisa Romkey serves as an Associate Professor, Teaching Stream with the Division of Engineering Science at the University of Toronto. In this position, Lisa plays a central role in the evaluation, design and delivery of a dynamic and complex curriculum, while facilitating the development and implementation of various teaching and learning initiatives.

Lisa is cross-appointed with the Department of Curriculum, Teaching and Learning at OISE/UT, and teaches undergraduate courses in engineering and society and graduate courses in engineering education. Lisa completed an undergraduate degree in Environmental Science at the University of Guelph and a master's degree in Curriculum Studies at the University of Toronto. Her research interests include teaching and assessment in engineering education.



Susan McCahan University of Toronto


Susan McCahan is a Professor in the Department of Mechanical and Industrial Engineering at the University of Toronto. She currently holds the position of Vice Provost, Innovations in Undergraduate Education. She received her B.S. (Mechanical Engineering) from Cornell University, and M.S. and Ph.D. (Mechanical Engineering) from Rensselaer Polytechnic Institute. She is a Fellow of the American Association for the Advancement of Science in recognition of contributions to engineering education, and has been the recipient of several major teaching and teaching leadership awards, including the 3M National Teaching Fellowship and the Medal of Distinction in Engineering Education from Engineers Canada.



Gayle Lesmond University of Toronto


Gayle Lesmond is a Research Assistant in the Department of Mechanical and Industrial Engineering at the University of Toronto. Her area of specialization is rubric development and testing.



This paper describes the process of testing and refining modular rubric rows developed for the assessment of engineering design activities. This is one component of a larger project to develop universal analytic rubrics for valid and reliable competency assessment across different academic disciplines and years of study. The project is being undertaken by researchers based in the faculty of applied science and engineering at a large research-intensive public university. From January 2014 to June 2015, we defined and validated indicators (criteria) for design and communication learning outcomes, then created descriptors for each indicator to discriminate among four levels of performance: Fails, Below, Meets, and Exceeds graduate expectations. From this bank of modular rubric items, applicable rows can be selected and compiled to produce a rubric tailored to a particular assessment activity. Here we discuss these rubrics within the larger context of the assessment of engineering design. We tested draft rubrics in focus group sessions with assessors. We followed the testing with structured discussions to elicit feedback on the quality and usability of these rubrics, and to investigate how the assessors interpreted the language used in the indicators and descriptors. We asked participants to identify indicators they believed were irrelevant, redundant, or missing from the rubric. We also asked them to identify and discuss indicators and descriptors that were confusing. Finally, we asked them what changes they would recommend and what training materials they would find useful when using rubrics of this design. By transcribing, coding, and analyzing recordings of these discussions, we identified rubric content that is unclear, ambiguous, or confusing to assessors and synthesized their recommendations for making the rubrics more usable.
While some rubric rows received similar criticism from most participants, we identified many differences in assessors' rubric design preferences and in how they apply rubrics to evaluate student work. For example, some participants stated that the level of specificity in the indicators and descriptors made it more difficult to select a performance level. This feedback is surprising, as it seems to contradict claims in the literature that more specific descriptions of quality make rating more consistent. It also emerged that assessors hold different conceptions of engineering design and the design process, and are confused when presented with unfamiliar terminology. We will apply this information to refine the rubrics and develop accompanying training materials. The improved rubric items will be evaluated for inter-rater reliability and deployed in academic courses. We will also perform user testing with undergraduate students.

Dawe, N., Romkey, L., McCahan, S., & Lesmond, G. (2016, June). User Testing with Assessors to Develop Universal Rubric Rows for Assessing Engineering Design. Paper presented at the 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. https://doi.org/10.18260/p.27118

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2016 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015