Austin, Texas
June 14-17, 2009
ISSN: 2153-5965
Division: Educational Research and Methods
Pages: 14.742.1 - 14.742.18 (18 pages)
DOI: 10.18260/1-2--5453
https://peer.asee.org/5453
Monica Cardella is an Assistant Professor of Engineering Education at Purdue University. She received her B.S. in Mathematics from the University of Puget Sound and her M.S. and Ph.D. in Industrial Engineering from the University of Washington. She teaches in the First-Year Engineering Program at Purdue as well as the Interdisciplinary Engineering program. Her research interests include engineers' uses of mathematical thinking in conceptual design as well as qualitative research in engineering education.
Heidi Diefes-Dux is an Associate Professor in the School of Engineering Education at Purdue University. She received her B.S. and M.S. in Food Science from Cornell University and her Ph.D. in Food Process Engineering from the Department of Agricultural and Biological Engineering at Purdue University. Since 1999, she has been a faculty member within the First-Year Engineering Program at Purdue. She coordinated (2000-2006) and continues to teach in the required first-year engineering problem solving and computer tools course. Her research focuses on the development, implementation, and assessment of model-eliciting activities with realistic engineering contexts.
Amber Oliver is pursuing her Master of Science in Human Resource Management in the Krannert School of Management at Purdue University. She received her B.S. in Organizational Communication with a concentration in Human Factors Engineering from Purdue University.
Matthew Verleger is a doctoral candidate in the School of Engineering Education at Purdue University. He received his B.S. in Computer Engineering and his M.S. in Agricultural and Biological Engineering, both from Purdue University. His research interests include how students develop mathematical modeling skills through the use of model-eliciting activities and the use of peer review as a pedagogical tool.
Insights into the Process of Providing Feedback to Students on Open-Ended Problems

Keywords: Feedback, Open-ended problems, Teaching Assistants
Abstract
One of the challenges of implementing open-ended problems is assessing students’ responses, as the open-ended nature of the problems allows for multiple appropriate, “good” responses. In particular, formative assessment—giving the students feedback on intermediate solutions—can be especially challenging when it is hoped that students will understand and respond to the feedback in ways that indicate learning has taken place. This study is part of a larger project that focuses on the feedback that students are given as they iterate through multiple drafts of their solutions to Model-Eliciting Activities (MEAs). In this paper, we report on findings related to Graduate Teaching Assistants’ experiences in providing their students with feedback. Two cases are presented: the experiences of a Teaching Assistant who is new to MEAs, and therefore new to the process of giving feedback on MEA solutions, and the experiences of a more experienced Teaching Assistant.
I. Introduction
Engineering educators nationwide and globally recognize the need for students to develop teaming and communication skills, proficiency in engineering science and design, and an ability to address open-ended problems replete with ambiguity and uncertainty1. One instructional approach to developing these competencies is the use of open-ended, realistic, client-driven problems called Model-Eliciting Activities2. This approach has been used with first-year engineering students3,4, as well as upper-level engineering students5. One of the challenges in adopting this approach, however, is assessing students’ responses to the Model-Eliciting Activities, as the open-ended nature of the problems allows for multiple appropriate, “good” responses. In particular, formative assessment—giving the students feedback on intermediate solutions—can be especially challenging, as it is hoped that students will use this feedback to gain new insights into the problem they are solving and produce a higher quality solution in the next iteration.
Model-Eliciting Activities (MEAs)
Model-Eliciting Activities (MEAs) are client-driven, open-ended problems that are developed using six principles for designing MEAs6 that have been modified for engineering contexts7,2. The intention is to construct realistic engineering problems that (1) require student teams to develop mathematical models for clients and (2) provide a natural window on students’ thinking about the mathematics in the problem context. That is, the problems are “model-eliciting” and “thought-revealing”6. Students’ solutions to these problems are generalizable mathematical models – meaning the models are shareable, modifiable, and reusable tools6. To develop a generalizable mathematical model for a client, students must draw on and make new sense of
Cardella, M., & Diefes-Dux, H., & Oliver, A., & Verleger, M. (2009, June), Insights Into The Process Of Providing Feedback To Students On Open Ended Problems Paper presented at 2009 Annual Conference & Exposition, Austin, Texas. 10.18260/1-2--5453
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2009 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015