June 26, 2011
June 29, 2011
Educational Research and Methods
Evaluating Student Responses in Open-Ended Problems Involving Iterative Solution Development

Open-ended problems are an important part of the engineering curriculum because, when well designed, they closely resemble problem-solving situations students will encounter as professional engineers. However, valid and reliable evaluation of student performance on open-ended problems is a challenge, given that numerous reasonable responses are likely to exist for a given problem and that multiple instructors and peers may be evaluating student work. In previous work, evaluation tools for open-ended problems, specifically Model-Eliciting Activities (MEAs), were developed; these included a rubric, task-specific supports, and scorer training. Their development drew on three educational research perspectives: the models and modeling perspective, design research methodology, and multi-tiered teaching experiment methodology. This ensured that the evaluation tools evolved with fidelity to the characteristics of high performance that professionals care about and with increased reliability.

One feature of MEA implementation that models real-world engineering problem solving is the iterative nature of solution development. This study uses the previously developed evaluation tools to examine the changes that student teams made between solution iterations, changes that would indicate progress toward higher-quality solutions over the course of those iterations. The Just-in-Time Manufacturing (JIT) MEA was implemented in spring 2009 in a large first-year engineering course. Each student team developed three versions of its solution: First Draft, Second Draft, and Team Final. Between versions, the students received peer feedback, experienced problem updates in the form of different or additional data, and received TA feedback. From all student team work submitted, a sample of 50 teams was randomly selected for this study.
Each of the teams’ three versions was rigorously scored and coded by an engineering expert using the previously developed evaluation tools, which included a four-dimension generic MEA Rubric and JIT MEA-specific assessment supports. The four dimensions were quality of the mathematical model, share-ability, re-usability, and modifiability. The expert scores were then compared across the three versions, revealing a positive trend of improvement in student performance across all dimensions. The largest gains were seen in share-ability, which refers to the clarity of the actual procedure, such that a user can easily replicate the results. These findings have implications for instruction along each dimension. They also provide opportunities to investigate the nature of the peer and TA feedback that may (or may not) have resulted in change on a given iteration.
Carnes, M. T., Diefes-Dux, H. A., & Cardella, M. E. (2011, June). Evaluating student responses in open-ended problems involving iterative solution development in Model-Eliciting Activities. Paper presented at the 2011 ASEE Annual Conference & Exposition, Vancouver, BC. https://doi.org/10.18260/1-2--17928
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2011 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all of the original authors and their institutions and name the host city of the conference. Last updated April 1, 2015.