Location: Vancouver, BC
Publication Date: June 26, 2011
Conference Start Date: June 26, 2011
Conference End Date: June 29, 2011
ISSN: 2153-5965
Division: Educational Research and Methods
Page Count: 22
Page Numbers: 22.1339.1 - 22.1339.22
DOI: 10.18260/1-2--18515
Permanent URL: https://peer.asee.org/18515
Download Count: 453
Amanda Fry is a doctoral candidate in Art Education at Purdue University. She received her B.S. in Art Education from Indiana State University and her M.A. in Art Education from Purdue University. Her research interests include qualitative research in engineering education and investigating the effects of an instructional model in which academically struggling secondary students mentor elementary students in the creation of artwork as a means of improving their academic performance.
Monica E. Cardella is an Assistant Professor of Engineering Education and is the Co-Director of Assessment Research for the Institute for P-12 Engineering Research and Learning (INSPIRE) at Purdue University. Dr. Cardella earned a B.Sc. in Mathematics from the University of Puget Sound and an M.S. and Ph.D. in Industrial Engineering at the University of Washington. At the University of Washington she worked with the Center for Engineering Learning and Teaching (CELT) and the LIFE Center (Learning in Informal and Formal Environments). She was a CASEE Postdoctoral Engineering Education Researcher at the Center for Design Research at Stanford before beginning her appointment at Purdue. Her research interests include: learning in informal and out-of-school time settings, pre-college engineering education, design thinking, mathematical thinking, and assessment research.
Heidi Diefes-Dux is an Associate Professor in the School of Engineering Education at Purdue University. She received her B.S. and M.S. in Food Science from Cornell University and her Ph.D. in Food Process Engineering from the Department of Agricultural and Biological Engineering at Purdue University. Since 1999, she has been a faculty member in Purdue’s First-Year Engineering Program, the gateway for all first-year students entering the College of Engineering. She is currently the Director of Teacher Professional Development for the Institute for P-12 Engineering Research and Learning (INSPIRE). Her research interests center on implementation and assessment of mathematical modeling problems.
Student Responses to and Perceptions of Feedback Received on a Series of Model-Eliciting Activities: A Case Study

One challenge in implementing open-ended problems is assessing students' responses, because the open-ended nature of the problems allows for numerous suitable, "good" responses. Formative assessment in particular, providing students with feedback on intermittent solutions, can be especially challenging when it is hoped that students will understand and respond to the feedback in ways that indicate learning has taken place.

The aim of this study is to examine how students perceive and respond to feedback received from a Teaching Assistant (TA) and their peers. This study is part of a larger project that focuses on the feedback students receive as they iterate through multiple drafts of their solutions to Model-Eliciting Activities (MEAs). MEAs are open-ended problems requiring students to work in teams to develop mathematical models and communicate their recommendations to a fictitious client.

In this paper, we report findings based upon three interviews the students participated in following three MEAs implemented in a single semester. Data analysis consisted of coding the interviews using an open coding scheme. Frequently used codes were examined to determine recurring themes in the data. The cases presented are four students belonging to the same team who received the same TA and peer feedback on the three MEA solutions they created. Even though the students created their MEA solutions together and received their TA and peer feedback as a group, they did not always view the feedback received in the same manner.

Findings indicated that all four students struggled with the feedback received from their peers. Peer feedback was often not helpful and was sometimes ignored. The students also agreed that the quality of peer feedback received deteriorated over the three MEAs. There were differences in their perceptions of which portions of the peer feedback were the most helpful. The students agreed that TA feedback was helpful in improving their MEA solutions and was more useful than the peer feedback. However, the students had contradictory perceptions of the level of specificity and vagueness in the TA feedback. Findings from this study suggest that TA feedback should be as specific to each MEA solution as possible. In addition, the peer feedback process requires additional examination to improve the quality of feedback given. This study supports the notion that students need training and education both in how to give feedback and in how to respond to feedback.
Fry, A. S., & Cardella, M. E., & Diefes-Dux, H. A. (2011, June). Student Responses to and Perceptions of Feedback Received on a Series of Model-Eliciting Activities: A Case Study. Paper presented at 2011 ASEE Annual Conference & Exposition, Vancouver, BC. 10.18260/1-2--18515
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2011 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.