Objective Scoring Partial Credits by Tracking Failure Cascade in Mechanics Problem Solving

Conference

2020 ASEE Virtual Annual Conference Content Access

Location

Virtual Online

Publication Date

June 22, 2020

Start Date

June 22, 2020

End Date

June 26, 2020

Conference Session

Grading and Feedback Models in Mechanics

Tagged Division

Mechanics

Page Count

17

DOI

10.18260/1-2--35005

Permanent URL

https://peer.asee.org/35005

Download Count

486

Paper Authors

Andrew Dongjin Kim Georgia State University

Andrew Dongjin Kim is an Assistant Professor of Computer Science and Engineering at Georgia State University.

Abstract

Consistent and objective grading of open-ended questions in mechanics courses is a challenge, especially when offering partial credit for failures arising from mistakes. In general, there is no standard grading rubric for such failures; in most cases, the decision depends solely on each instructor's judgment as well as each question's level of difficulty and the length of its solution. This study aimed to find an alternative approach that assesses students' integrated interpretation-planning-execution level in solving open-ended questions and objectively tracks students' failures that cascade into incorrect results.

The method proposed here uses a variable-based question set. Unlike traditional open-ended questions, the proposed model consists of a set of sub-questions listed in the sequence of a suggested solving procedure, such that a student who solves each sub-question correctly will reach the final answer ultimately requested by the main question statement. Consequently, if a student fails to solve an earlier sub-question correctly, that error cascades into the following sub-questions. The variable-based sub-question sequence was chosen because it was considered the finest practical granularity for the question set.

Error cascade was tracked by plugging the student's answers from previous steps into the equation for the current step; a MATLAB script repeated this failure tracking for all sub-questions. If the student's answer for the current step matched the result of the failure-tracking subroutine, the incorrect answer was attributed to a calculation error made in a previous step, and the deducted points were returned. Otherwise, the failure was treated as an independent calculation error or the result of a misconception.
Assessment results collected over two years in engineering courses at Georgia State University were examined. The statistics revealed that, on average, 60% (±22%) of students were at risk of losing points due to error cascade; on average, 10 percentage points (±6 percentage points) of deductions were attributable to the effect of failure cascade; and a larger number of sub-questions detected more cascaded errors. The algorithm applied in this study can also be deployed for automatic grading, along with providing standardized feedback messages.
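The cascade-tracking idea described in the abstract can be sketched in a few lines of code. The paper reports a MATLAB implementation; the Python sketch below is only an illustration of the stated logic, and all function and variable names (`grade_with_cascade`, `steps`, `tol`, the example statics numbers) are hypothetical, not taken from the paper. Each sub-question is modeled as a function of the answers to the previous steps; a student's wrong answer earns credit back if it equals the value obtained by propagating the student's own earlier answers through the current step's equation.

```python
import math

def grade_with_cascade(steps, correct, student, points, tol=1e-6):
    """Grade a variable-based sub-question set, restoring credit for
    errors that merely cascade from an earlier mistake.

    steps   : list of callables; steps[i](prev_answers) computes step i
              from the answers to steps 0..i-1
    correct : reference answer for each step
    student : the student's answer for each step
    points  : points available for each step
    Returns the points earned per step.
    """
    earned = []
    for i, step in enumerate(steps):
        if math.isclose(student[i], correct[i], rel_tol=tol):
            earned.append(points[i])        # answer is simply correct
            continue
        # Failure-tracking pass: feed the student's own earlier answers
        # into this step's equation.
        propagated = step(student[:i])
        if math.isclose(student[i], propagated, rel_tol=tol):
            earned.append(points[i])        # wrong only because of cascade
        else:
            earned.append(0)                # new error or misconception
    return earned

# Hypothetical two-step example: reaction R = F/2 with F = 100 N,
# then moment M = R * L with L = 2 m.
steps = [lambda prev: 100.0 / 2.0,          # step 1: R
         lambda prev: prev[0] * 2.0]        # step 2: M = R * L
correct = [50.0, 100.0]
student = [40.0, 80.0]   # step 1 wrong; step 2 consistent with 40 * 2
print(grade_with_cascade(steps, correct, student, [10, 10]))  # [0, 10]
```

In this example the student loses the first step's points for the original error but keeps the second step's points, because 80 is exactly what the (incorrect) value 40 produces in the next equation, matching the abstract's rule of returning deducted points for cascaded errors.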

Kim, A. D. (2020, June), Objective Scoring Partial Credits by Tracking Failure Cascade in Mechanics Problem Solving. Paper presented at 2020 ASEE Virtual Annual Conference Content Access, Virtual Online. 10.18260/1-2--35005

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2020 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.