Location: Vancouver, BC
Publication Date: June 26, 2011
Conference Start Date: June 26, 2011
Conference End Date: June 29, 2011
ISSN: 2153-5965
Tagged Division: Educational Research and Methods
Page Count: 26
Page Numbers: 22.647.1 - 22.647.26
DOI: 10.18260/1-2--17928
Permanent URL: https://peer.asee.org/17928
Download Count: 563
Mark Carnes is a licensed Professional Engineer (PE) and is currently a doctoral student and a future faculty fellow in the School of Engineering Education at Purdue University. Before coming to Purdue, he spent over 30 years as an electronics designer of control and power conversion circuits. He received an M.S. from the University of Michigan (1982) and a B.S. from the University of Notre Dame (1975), both in Electrical Engineering.
Heidi Diefes-Dux is an Associate Professor in the School of Engineering Education at Purdue University. She received her B.S. and M.S. in Food Science from Cornell University and her Ph.D. in Food Process Engineering from the Department of Agricultural and Biological Engineering at Purdue University. Since 1999, she has been a faculty member in Purdue’s First-Year Engineering Program, the gateway for all first-year students entering the College of Engineering. She is currently the Director of Teacher Professional Development for the Institute for P-12 Engineering Research and Learning (INSPIRE). Her research interests center on implementation and assessment of mathematical modeling problems.
Monica E. Cardella is an Assistant Professor of Engineering Education and is the Co-Director of Assessment Research for the Institute for P-12 Engineering Research and Learning (INSPIRE) at Purdue University. Dr. Cardella earned a B.Sc. in Mathematics from the University of Puget Sound and an M.S. and Ph.D. in Industrial Engineering from the University of Washington. At the University of Washington she worked with the Center for Engineering Learning and Teaching (CELT) and the LIFE Center (Learning in Informal and Formal Environments). She was a CASEE Postdoctoral Engineering Education Researcher at the Center for Design Research at Stanford before beginning her appointment at Purdue. Her research interests include learning in informal and out-of-school time settings, pre-college engineering education, design thinking, mathematical thinking, and assessment research.
Evaluating Student Responses in Open-Ended Problems Involving Iterative Solution Development

Open-ended problems are an important part of the engineering curriculum because, when well-designed, they closely resemble problem-solving situations students will encounter as professional engineers. However, valid and reliable evaluation of student performance on open-ended problems is a challenge given that numerous reasonable responses are likely to exist for a given problem and multiple instructors and peers may be evaluating student work. In previous work, evaluation tools (including a rubric, task-specific supports, and scorer training) for open-ended problems, specifically Model-Eliciting Activities (MEAs), were developed using three educational research perspectives: the models and modeling perspective, design research methodology, and multi-tiered teaching experiment methodology. This ensured that the evaluation tools evolved with fidelity to the characteristics of high performance that professionals care about and with increased reliability.

One of the features of the MEA implementation that models real-world engineering problem solving is the iterative nature of solution development. This study uses the previously developed evaluation tools to examine the changes that student teams made between solution iterations which would indicate that they were making progress toward higher-quality solutions over the course of these iterations. The Just-in-Time Manufacturing MEA was implemented in spring 2009 in a large first-year engineering course. Each student team developed three versions of their solution: First Draft, Second Draft, and Team Final. Between the versions, the students received peer feedback, experienced problem updates in the form of different/additional data, and received TA feedback. From all student team work submitted, a sample of 50 teams was randomly selected for this study. Each of the teams' three versions was rigorously scored and coded by an engineering expert using the previously developed evaluation tools, which included a four-dimension generic MEA Rubric and JIT MEA-specific assessment supports. The four dimensions were: quality of the mathematical model, share-ability, re-usability, and modifiability. The expert scores were then compared across the three versions, showing a positive trend of improvement in student performance across all dimensions. The largest gains were seen in the aspect of share-ability, which refers to the clarity of the actual procedure, such that the user can easily replicate results. These findings have implications for instruction along each dimension. These findings also provide opportunities to investigate the nature of peer and TA feedback that may (or may not) have resulted in change on a given iteration.
Carnes, M. T., Diefes-Dux, H. A., & Cardella, M. E. (2011, June). Evaluating Student Responses in Open-Ended Problems Involving Iterative Solution Development in Model-Eliciting Activities. Paper presented at the 2011 ASEE Annual Conference & Exposition, Vancouver, BC. https://doi.org/10.18260/1-2--17928
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2011 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015