
Analysis of Feedback Quality on Engineering Problem-solving Tasks


Conference

2019 ASEE Annual Conference & Exposition

Location

Tampa, Florida

Publication Date

June 15, 2019

Start Date

June 15, 2019

End Date

June 19, 2019

Conference Session

ERM Technical Session 5: Assessment

Tagged Division

Educational Research and Methods

Page Count

13

Permanent URL

https://peer.asee.org/32086



Paper Authors


Bahar Memarian University of Toronto


Bahar Memarian is a PhD candidate in Industrial Engineering and the Collaborative Specialization in Engineering Education at the University of Toronto, Canada. Her primary research interests are in Human Factors and Systems Engineering, specifically their application in education (e.g., learning outcomes assessment and engineering problem solving). Before that, she completed her MASc (2015) and BASc (2012) in Electrical Engineering at the University of Toronto.


Susan McCahan University of Toronto


Susan McCahan is a Professor in the Department of Mechanical and Industrial Engineering at the University of Toronto. She currently holds the positions of Vice-Provost, Innovations in Undergraduate Education and Vice-Provost, Academic Programs. She received her B.S. (Mechanical Engineering) from Cornell University, and M.S. and Ph.D. (Mechanical Engineering) from Rensselaer Polytechnic Institute. She is a Fellow of the American Association for the Advancement of Science in recognition of contributions to engineering education, and has been the recipient of several major teaching and teaching leadership awards, including the 3M National Teaching Fellowship and the Medal of Distinction in Engineering Education from Engineers Canada.


Abstract

This is a research paper. This study examines the types of feedback provided to students on engineering problem-solving tasks. The assessment-for-learning conceptual framework is adopted, which holds that assessment tasks should further learning rather than being only summative. In this work, the feedback observed on marked midterm tests and final exam papers is coded to investigate whether it aligns with an assessment-for-learning approach. The types of feedback on the papers were characterized using a hierarchical schema, with check marks (basic validating feedback) being the least effective and textual comments (elaborating feedback) being the most effective. The proposed classification is then used to code graded student test papers (naturalistic data) from three electrical engineering courses. The data include 7 problems from each course, for 21 engineering problems in total. Between 16 and 27 graded student solutions were randomly selected for analysis for each problem. Descriptive and statistical analyses are carried out. The results demonstrate that poor-quality student solutions receive less feedback, and less valuable feedback, than high-quality student work. The results also exhibit a high degree of variability in the types of feedback provided on student work. The findings of this study are useful for informing instructional design and changes to assessment practices.
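The coding approach described in the abstract could be sketched as follows. This is a minimal illustrative sketch, not the paper's actual schema: the category names, the intermediate "error_flag" category, and the numeric ranks are assumptions introduced here; only the ordering (check marks as least elaborative, textual comments as most elaborative) comes from the abstract.

```python
from collections import Counter

# Hypothetical hierarchy inspired by the abstract: higher rank = more
# elaborative feedback. "error_flag" is an assumed middle category.
FEEDBACK_RANK = {
    "check_mark": 1,       # basic validating feedback (least effective)
    "error_flag": 2,       # assumed intermediate category
    "textual_comment": 3,  # elaborating feedback (most effective)
}

def summarize(solutions):
    """Count feedback types across a set of graded solutions (each a
    list of mark labels) and report the mean hierarchy rank."""
    counts = Counter(mark for sol in solutions for mark in sol)
    total = sum(counts.values())
    mean_rank = sum(FEEDBACK_RANK[m] * n for m, n in counts.items()) / total
    return counts, mean_rank

# Toy sample: three graded solutions to one problem
sample = [
    ["check_mark", "check_mark"],
    ["check_mark", "textual_comment"],
    ["error_flag"],
]
counts, mean_rank = summarize(sample)
print(counts)     # frequency of each feedback type
print(mean_rank)  # average position in the hierarchy
```

A per-problem summary like this would support the kind of descriptive comparison the abstract reports, e.g. contrasting the mean rank on low-scoring versus high-scoring solutions.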

Memarian, B., & McCahan, S. (2019, June), Analysis of Feedback Quality on Engineering Problem-solving Tasks Paper presented at 2019 ASEE Annual Conference & Exposition , Tampa, Florida. https://peer.asee.org/32086

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2019 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015