Using a Delphi Approach to Develop Rubric Criteria

Conference: 2016 ASEE Annual Conference & Exposition

Location: New Orleans, Louisiana

Publication Date: June 26, 2016

Start Date: June 26, 2016

End Date: August 28, 2016

ISBN: 978-0-692-68565-5

ISSN: 2153-5965

Conference Session: Assessment I: Developing Assessment Tools

Tagged Division: Educational Research and Methods

Page Count: 13

DOI: 10.18260/p.27121

Permanent URL: https://peer.asee.org/27121


Paper Authors

Gayle Lesmond, University of Toronto

Gayle Lesmond is a Research Assistant in the Department of Mechanical and Industrial Engineering at the University of Toronto. Her area of specialization is rubric development and testing.

Nikita Dawe, University of Toronto

Nikita Dawe is an M.A.Sc. candidate in the Department of Mechanical and Industrial Engineering at the University of Toronto. She is completing the Collaborative Program in Engineering Education.

Lisa Romkey, University of Toronto

Lisa Romkey serves as an Associate Professor, Teaching Stream with the Division of Engineering Science at the University of Toronto. In this position, Lisa plays a central role in the evaluation, design and delivery of a dynamic and complex curriculum, while facilitating the development and implementation of various teaching and learning initiatives.

Lisa is cross-appointed with the Department of Curriculum, Teaching and Learning at OISE/UT, and teaches undergraduate courses in engineering & society, and graduate courses in engineering education. Lisa completed an Undergraduate Degree in Environmental Science at the University of Guelph, and a Master’s Degree in Curriculum Studies at the University of Toronto. Research interests include teaching and assessment in engineering education.

Susan McCahan, University of Toronto

Susan McCahan is a Professor in the Department of Mechanical and Industrial Engineering at the University of Toronto. She currently holds the position of Vice Provost, Innovations in Undergraduate Education. She received her B.S. (Mechanical Engineering) from Cornell University, and her M.S. and Ph.D. (Mechanical Engineering) from Rensselaer Polytechnic Institute. She is a Fellow of the American Association for the Advancement of Science in recognition of contributions to engineering education, and has received several major teaching and teaching leadership awards, including the 3M National Teaching Fellowship and the Medal of Distinction in Engineering Education from Engineers Canada.


Abstract

Recent developments in post-secondary institutions have motivated a shift towards outcomes-based education. A major impetus for this agenda has been the growing need to provide concrete evidence of student learning and institutional effectiveness to various stakeholders. Given this trend, it is important that research be undertaken to explore valid approaches to learning outcomes assessment.

The research described here involves the development of valid, non-discipline-specific, analytic rubrics that assess learning outcomes in five key areas: communication, design, teamwork, problem analysis, and investigation. This paper reports on the methodology used to complete the first stage of rubric development: identifying the standards by which student work is evaluated. In particular, a two-stage Delphi study was designed to identify rubric criteria for assessing problem analysis and investigation. The Delphi technique is an iterative research tool used to elicit input from a panel of experts on a particular topic. It typically involves a series of virtual survey rounds in which experts offer their views anonymously and have the opportunity to refine them based on controlled feedback from earlier rounds. The panels comprised 11 experts for investigation and 15 experts for problem analysis, drawn from faculty and staff. In the first round, participants were asked to propose learning outcome statements, or "indicators," that are important for assessing problem analysis or investigation. In the second and final round, these responses were arranged by major outcome area and sent back to participants, who were asked to rate how likely they were to use each indicator and how important it was in the curriculum. This research paper focuses on the processes involved in designing and administering a Delphi survey for the purpose of developing tools for learning outcomes assessment, including expert selection, survey design, and analysis of expert responses. Special attention is paid to the challenges of conducting a Delphi study.
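The paper does not publish its analysis procedure here, but the second-round step it describes, experts rating proposed indicators, is typically summarized with simple consensus statistics. As an illustration only, the sketch below aggregates hypothetical Likert-scale ratings per indicator using the median and interquartile range (IQR), a common, assumed consensus criterion in Delphi studies; the indicator names, ratings, and IQR threshold are all invented for this example.

```python
from statistics import median, quantiles

# Hypothetical second-round Delphi ratings (1-5 Likert scale) for each
# proposed indicator; indicator names and values are illustrative only.
ratings = {
    "Identifies known and unknown information": [5, 4, 5, 4, 5, 3, 4],
    "Selects an appropriate solution process":  [4, 4, 3, 5, 4, 4, 2],
    "States assumptions explicitly":            [3, 2, 4, 3, 5, 2, 3],
}

def consensus(scores, iqr_threshold=1.0):
    """Return (median, IQR, consensus?) for one indicator's ratings.

    An IQR at or below the threshold is treated as consensus here;
    the threshold value is an assumption, not taken from the paper.
    """
    q1, _, q3 = quantiles(scores, n=4)  # quartile cut points
    iqr = q3 - q1
    return median(scores), iqr, iqr <= iqr_threshold

for indicator, scores in ratings.items():
    med, iqr, agreed = consensus(scores)
    print(f"{indicator}: median={med}, IQR={iqr:.1f}, consensus={agreed}")
```

Indicators that fail the consensus check would typically be revised or re-rated in a further round, while those meeting it become candidate rubric criteria.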

Lesmond, G., Dawe, N., Romkey, L., & McCahan, S. (2016, June). Using a Delphi Approach to Develop Rubric Criteria. Paper presented at 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. 10.18260/p.27121

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2016 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015