The benefits of writing machine-graded final exams to be capable of more nuanced feedback in large foundational mechanics courses.

Conference

2022 ASEE Annual Conference & Exposition

Location

Minneapolis, MN

Publication Date

August 23, 2022

Start Date

June 26, 2022

End Date

June 29, 2022

Conference Session

Assessment in Mechanics Courses

Page Count

12

DOI

10.18260/1-2--40534

Permanent URL

https://peer.asee.org/40534

Paper Authors

James Lord, Virginia Polytechnic Institute and State University

James earned a Ph.D. in Biomechanical Engineering from Newcastle University, England, in 2012 for his work on metal-on-metal hip prostheses.

He works as a collegiate assistant professor in the Department of Biomedical Engineering and Mechanics at Virginia Tech, where he coordinates and teaches introductory courses in statics and mechanics of materials. His research interests include pedagogy and policy for large introductory mechanics classes, assessment measures for both students and faculty, and the effects on student learning of an increased reliance on non-tenure-track teaching faculty.

Abstract

We discuss an approach to multiple-choice exams that awards partial credit to students who make minor common mistakes when calculating their numerical solutions, in order to promote more nuanced feedback and grading. Assessing student performance in large foundational engineering courses can be challenging. This is especially true for summative assessment, which is commonly conducted with a cumulative final exam at the end of the semester. Foundational courses often have large numbers of students across multiple sections, and grades for final exams usually need to be returned quickly. It is challenging to create comprehensive exams that appropriately test students' understanding of the material yet can be graded within 1–2 days. One approach is the use of machine-graded multiple-choice exams that are common to all students in every section of the course.

However, these exams suffer from issues of their own. In particular, a multiple-choice exam does not differentiate between a student who could not even start a problem and a student who did everything correctly but made a minor mathematical error. Unlike with hand-graded exams, awarding partial credit on a multiple-choice exam is not trivial. We explore an attempt to write multiple-choice exams on which partial credit can be earned.

Three large foundational engineering courses were selected: Statics (602 students across 11 sections); Mechanics of Deformable Bodies (MDB) (158 students across 4 sections); and Basic Principles of Structures (BPOS) (152 students in one section). In the Fall 2021 semester, the final exams were written such that approximately half of the questions (Statics = 11/20, MDB = 11/22, BPOS = 8/15) had a dummy answer that would be found if a student made a common mistake on the problem. The exams were administered at the end of the semester and machine graded as normal, but the ‘common mistake’ answers were awarded 50% credit.
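
The rescoring itself is straightforward to automate once the answer key records, for each question, the correct choice and, where one was written, the common-mistake distractor. The following Python sketch illustrates the 50%-credit rule under those assumptions; the question identifiers, data structures, and function names are hypothetical and are not taken from the paper or from the grading software actually used.

# Hypothetical sketch of the 50%-credit scoring rule described above.
# Question IDs, answer choices, and structures are illustrative only.

# Answer key: question id -> (correct choice, common-mistake choice or None)
ANSWER_KEY = {
    "Q01": ("C", "A"),   # "A" would be reached via a designated common mistake
    "Q02": ("B", None),  # no partial-credit distractor written for this question
}

def score_response(question_id: str, choice: str) -> float:
    """Return 1.0 for the correct choice, 0.5 for the flagged common mistake, else 0.0."""
    correct, common_mistake = ANSWER_KEY[question_id]
    if choice == correct:
        return 1.0
    if common_mistake is not None and choice == common_mistake:
        return 0.5
    return 0.0

def score_exam(responses):
    """Percentage score for one student's responses, a dict of question id -> choice."""
    points = sum(score_response(qid, ans) for qid, ans in responses.items())
    return 100.0 * points / len(ANSWER_KEY)

# Example: a student who picks the common-mistake distractor on Q01
print(score_exam({"Q01": "A", "Q02": "B"}))  # 75.0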

Mean student scores increased from 66.3% to 70.8% in Statics, from 49.7% to 54.5% in MDB, and from 57.9% to 64.4% in BPOS. We also saw a significant increase in the mean percentage of students receiving at least some credit on questions where partial credit was available: from 61.8% to 78.6% in Statics, from 46.2% to 65.8% in MDB, and from 52.6% to 74.3% in BPOS.
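
The "at least some credit" figures can be recovered from a per-item score matrix by counting, for each partial-credit question, the share of students with a non-zero score. A minimal sketch of that aggregation, using randomly generated stand-in data rather than the study's responses (array shapes and names are assumptions):

import numpy as np

# item_scores: (students x questions) array of per-item scores (0.0, 0.5, or 1.0)
# partial_items: boolean mask flagging questions that offered a common-mistake distractor
rng = np.random.default_rng(0)
item_scores = rng.choice([0.0, 0.5, 1.0], size=(150, 20))  # stand-in data only
partial_items = np.zeros(20, dtype=bool)
partial_items[:11] = True  # e.g., 11 of 20 questions carried partial credit

# Per-question share of students earning any credit, then the mean across those questions
any_credit = (item_scores[:, partial_items] > 0).mean(axis=0)
print(f"Mean share of students with at least some credit: {any_credit.mean():.1%}")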

We observed a small increase in Cronbach's alpha, from 0.848 to 0.860 in Statics, from 0.867 to 0.878 in MDB, and from 0.785 to 0.802 in BPOS. There was a positive correlation between student scores on the multiple-choice final exam and on their constructed-response midterm exams, and these correlations did not change significantly when partial credit was applied. This method of awarding partial credit on multiple-choice exams shows some promise and, for our specific needs, has certain advantages over other methods, including the ability to require numeric answers, minimal change to the existing exam format, and only a small increase in the time needed to create the exam.
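
For readers who want to run the same reliability and correlation checks on their own exams, Cronbach's alpha and the Pearson correlation can be computed directly from a per-item score matrix and the students' midterm totals. A minimal sketch using NumPy and SciPy, again with random stand-in data rather than the study's results (values printed will not match the figures reported above):

import numpy as np
from scipy.stats import pearsonr

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a (students x items) score matrix."""
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)       # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Illustrative use with random stand-in data
rng = np.random.default_rng(1)
final_items = rng.choice([0.0, 0.5, 1.0], size=(150, 20))   # per-item final exam scores
midterm_totals = rng.uniform(40, 100, size=150)              # constructed-response midterm totals

alpha = cronbach_alpha(final_items)
r, p = pearsonr(final_items.sum(axis=1), midterm_totals)     # final vs. midterm correlation
print(f"alpha = {alpha:.3f}, r = {r:.3f} (p = {p:.3g})")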

Lord, J. (2022, August), The benefits of writing machine-graded final exams to be capable of more nuanced feedback in large foundational mechanics courses. Paper presented at 2022 ASEE Annual Conference & Exposition, Minneapolis, MN. 10.18260/1-2--40534

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2022 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015