
Assessing the Reliability of a Chemical Engineering Problem-solving Rubric when Using Multiple Raters


Conference: 2019 ASEE Annual Conference & Exposition
Location: Tampa, Florida
Publication Date: June 15, 2019
Start Date: June 15, 2019
End Date: June 19, 2019
Conference Session: ERM Technical Session 5: Assessment
Tagged Division: Educational Research and Methods
Page Count: 15
Permanent URL: https://peer.asee.org/32124


Paper Authors

Timothy Ryan Duckett, Acumen Research and Evaluation, LLC (orcid.org/0000-0001-8060-6149)

T. Ryan Duckett is a research associate with Acumen Research and Evaluation, LLC, a program evaluation and grant-writing company that specializes in STEM and early childhood education. He is a PhD student in the Research and Measurement department at the University of Toledo.

Matthew W. Liberatore, University of Toledo (orcid.org/0000-0002-5495-7145)

Matthew W. Liberatore is a Professor of Chemical Engineering at the University of Toledo. He earned a B.S. degree from the University of Illinois at Chicago and M.S. and Ph.D. degrees from the University of Illinois at Urbana-Champaign, all in chemical engineering. His current research involves the rheology of complex fluids as well as active learning, reverse engineering online videos, and interactive textbooks. His website is: http://www.utoledo.edu/engineering/chemical-engineering/liberatore/

Uchenna Asogwa, University of Toledo

Uchenna Asogwa is a graduate student in Chemical Engineering at the University of Toledo. He earned a B.S. degree in chemical engineering from the University of Benin, Nigeria. His current research involves reverse engineering online videos as well as the rheology of complex fluids.

Gale A. Mentzer, Acumen Research and Evaluation

Gale A. Mentzer, PhD, the owner and director of Acumen Research and Evaluation, has been a professional program evaluator since 1998. She holds a PhD in Educational Research and Measurement from The University of Toledo and a Master of Arts in English Literature and Language, a unique combination of specializations that melds quantitative and qualitative methodologies. She has extensive experience in the evaluation of projects focused on STEM education, including evaluations of several multi-million-dollar federally funded projects. Previously she taught graduate-level courses for the College of Education at The University of Toledo in Statistics, Testing and Grading, Research Design, and Program Evaluation.

Amanda Portis Malefyt, Trine University

Amanda Malefyt is currently Chair and Associate Professor in the McKetta Department of Chemical and Bioprocess Engineering at Trine University. She received her bachelor's degree from Trine (formerly Tri-State) University and her Ph.D. from Michigan State University. Her research interests include engineering education and nucleic acid therapeutics.


Abstract

This evidence-based practices paper discusses the method employed in validating the use of a project-modified version of the PROCESS tool (Grigg, Van Dyken, Benson, & Morkos, 2013) for measuring student problem-solving skills. The PROCESS tool allows raters to score students' ability in the domains of Problem definition, Representing the problem, Organizing information, Calculations, Evaluating the solution, Solution communication, and Self-assessment. Specifically, this research compares student performance on solving traditional textbook problems with novel, student-generated learning activities (i.e., reverse engineering videos in order to create their own homework problems and solutions). The use of student-generated learning activities to assess student problem-solving skills has theoretical underpinnings in Felder's (1987) work on "creating creative engineers," as well as the need to develop students' abilities to transfer learning and solve problems in a variety of real-world settings. In this study, four raters used the PROCESS tool to score the performance of 70 students randomly selected from two undergraduate chemical engineering cohorts at two Midwest universities. Students from both cohorts solved 12 traditional textbook-style problems, and students from the second cohort solved an additional nine student-generated video problems.

Any large-scale assessment in which multiple raters use a rating tool requires the investigation of several aspects of validity. The many-facet Rasch measurement model (MFRM; Linacre, 1989) has the psychometric properties to determine whether any characteristics other than "student problem-solving skills" influence the scores assigned, such as rater bias, problem difficulty, or student demographics. Before implementing the full rating plan, MFRM was used to examine how raters interacted with the six items on the modified PROCESS tool to score a random selection of 20 students' performance in solving one problem. An external evaluator led "inter-rater reliability" meetings in which raters deliberated the rationale for their ratings, and differences were resolved by recourse to Pretz et al.'s (2003) problem-solving cycle, which informed the development of the PROCESS tool. To test the new understandings of the PROCESS tool, raters were assigned to score one new problem from a different randomly selected group of six students. Those results were then analyzed in the same manner as before. This iterative process resulted in substantial increases in reliability, which can be attributed to increased confidence that raters were operating with common definitions of the items on the PROCESS tool and rating with consistent and comparable severity. This presentation will include examples of the student-generated problems and a discussion of common discrepancies and solutions encountered in the raters' initial use of the PROCESS tool. The findings, as well as the adapted PROCESS tool used in this study, can be useful to engineering educators and engineering education researchers.
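For readers unfamiliar with MFRM, the many-facet Rasch model cited above (Linacre, 1989) is commonly written in the following standard form; this is the textbook formulation of the model, not an equation reproduced from this paper:

```latex
\ln\!\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = B_n - D_i - C_j - F_k
```

where \(P_{nijk}\) is the probability that student \(n\) receives a rating in category \(k\) (rather than \(k-1\)) on item \(i\) from rater \(j\); \(B_n\) is the student's ability, \(D_i\) the item's difficulty, \(C_j\) the rater's severity, and \(F_k\) the difficulty of the step from category \(k-1\) to \(k\). Placing rater severity \(C_j\) on the same log-odds scale as ability and difficulty is what allows the model to separate rater effects from genuine differences in student problem-solving skill.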

Duckett, T. R., & Liberatore, M. W., & Asogwa, U., & Mentzer, G. A., & Malefyt, A. P. (2019, June), Assessing the Reliability of a Chemical Engineering Problem-solving Rubric when Using Multiple Raters. Paper presented at 2019 ASEE Annual Conference & Exposition, Tampa, Florida. https://peer.asee.org/32124

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2019 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015