Systematic Analysis of Formative Feedback, Focus on Electrical Engineering Assessments

Conference: 2019 ASEE Annual Conference & Exposition
Location: Tampa, Florida
Publication Date: June 15, 2019
Start Date: June 15, 2019
End Date: June 19, 2019
Conference Session: Assessment of Learning in ECE Courses
Tagged Division: Electrical and Computer
Page Count: 10
DOI: 10.18260/1-2--33334
Permanent URL: https://peer.asee.org/33334
Download Count: 336

Paper Authors

Bahar Memarian, University of Toronto

Bahar Memarian is a PhD candidate in Industrial Engineering and the Collaborative Specialization in Engineering Education at the University of Toronto, Canada. Her primary research interests are in Human Factors and Systems Engineering, specifically their application in education (e.g., learning outcomes assessment and engineering problem solving). She previously completed her MASc (2015) and BASc (2012) in Electrical Engineering at the University of Toronto.

Susan McCahan, University of Toronto

Susan McCahan is a Professor in the Department of Mechanical and Industrial Engineering at the University of Toronto. She currently holds the positions of Vice-Provost, Innovations in Undergraduate Education and Vice-Provost, Academic Programs. She received her B.S. (Mechanical Engineering) from Cornell University, and her M.S. and Ph.D. (Mechanical Engineering) from Rensselaer Polytechnic Institute. She is a Fellow of the American Association for the Advancement of Science in recognition of contributions to engineering education and has been the recipient of several major teaching and teaching leadership awards, including the 3M National Teaching Fellowship and the Medal of Distinction in Engineering Education from Engineers Canada.

Abstract

In the assessment of closed-ended engineering problem-solving tasks, a challenge instructors often face is ensuring the delivery of consistent and reliable feedback to students. Increasing the level of detail in a grading scheme is a technique commonly cited in the “assessment for learning” literature for increasing the reliability of grading and feedback delivery. This level of detail is referred to as the grading grain, or granularity, and the prevalence of this approach motivated us to examine whether a finer-grained grading scheme actually improves the observable quality of the feedback that teaching assistants give on electrical engineering tests.

Graded student test papers from three electrical engineering courses were used for this study. For each course, 7 unique problems were selected. For each problem, we randomly selected between 15 and 26 graded student solutions that represented the full range of performance levels. We also acquired from the instructor the associated solution guide (i.e. the correct solution) and grading scheme. Every problem was characterized in terms of its granularity and complexity.
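As an illustration only (this sketch is not taken from the paper), the sampling step described above might be implemented roughly as follows; the grade bands, band cutoffs, and the function name sample_solutions are assumptions made for this example, not the authors' documented procedure.

    import random

    # Hypothetical sketch: pick 15-26 graded solutions for one problem while
    # covering the full range of performance levels. Grade bands and cutoffs
    # below are illustrative assumptions.
    def sample_solutions(solutions, n_min=15, n_max=26, seed=0):
        """solutions: list of (student_id, grade_fraction) tuples."""
        rng = random.Random(seed)
        bands = {"low": [], "mid": [], "high": []}
        for student_id, grade in solutions:
            key = "low" if grade < 0.5 else ("mid" if grade < 0.8 else "high")
            bands[key].append((student_id, grade))
        target = rng.randint(n_min, n_max)
        sample = []
        # Draw roughly evenly from each band so all performance levels appear.
        for band in bands.values():
            rng.shuffle(band)
            sample.extend(band[: max(1, target // 3)])
        return sample[:target]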

Granularity and complexity served as the independent variables in this work. The definitions for these are derived from literature in the field and described in detail in the full paper. Our hypothesis was that increasing the granularity of the solution guide, and/or increasing the complexity of the problem would improve the quality of the feedback. The goal was to analyze the quality of feedback across test problems having varying grading granularity and level of problem complexity.

The feedback marks on the student papers were classified and coded. The categories included:

• Validating: e.g. check-marks
• Flagging: e.g. error identification such as cross-marks
• Penalizing: e.g. grade deductions
• Tagging: e.g. reference to a feedback framework, scheme, or rubric
• Elaborating: e.g. free-form textual feedback
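As an illustration only (not drawn from the paper), coded feedback marks of this kind could be tallied per category as in the short sketch below; the list of marks is hypothetical, and the category labels simply mirror the coding scheme above.

    from collections import Counter

    # Hypothetical coded marks observed on one graded solution.
    coded_marks = [
        "Validating",   # a check-mark beside a correct step
        "Flagging",     # a cross-mark on an erroneous step
        "Penalizing",   # a grade deduction written next to the error
        "Elaborating",  # a short written comment explaining the error
    ]

    # Count how often each feedback category appears on the solution.
    counts = Counter(coded_marks)
    for category in ["Validating", "Flagging", "Penalizing", "Tagging", "Elaborating"]:
        print(f"{category}: {counts.get(category, 0)}")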

The data show a positive correlation between problem complexity and feedback quality, but no relationship between the granularity of the solution guide and feedback quality. These findings matter because they reveal that a finer-grained grading scheme does not necessarily improve assessors’ feedback quality, contrary to what the literature presumes. They also demonstrate that a problem’s complexity can significantly influence the way assessors provide feedback to students in the context of electrical engineering problem-solving tasks. The results may suggest that, to improve the quality of feedback to students in electrical engineering courses, more time and attention should be spent on the quality of the problems given to students rather than on developing finer-grained grading schemes.
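The abstract does not state which statistical test was used; purely as an illustration, a rank correlation between per-problem complexity ratings and feedback-quality scores could be computed as below. The numbers are illustrative placeholders, not the study's data.

    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical per-problem values: a complexity rating and an aggregate
    # feedback-quality score. These numbers are illustrative only.
    complexity = np.array([1, 2, 2, 3, 4, 4, 5])
    feedback_quality = np.array([0.31, 0.40, 0.38, 0.52, 0.61, 0.58, 0.70])

    rho, p_value = spearmanr(complexity, feedback_quality)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")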

Memarian, B., & McCahan, S. (2019, June). Systematic Analysis of Formative Feedback, Focus on Electrical Engineering Assessments. Paper presented at the 2019 ASEE Annual Conference & Exposition, Tampa, Florida. 10.18260/1-2--33334

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2019 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.