Midterm oral exams add value as a predictor of final written exam performance in engineering classes: A multiple regression analysis

Conference

2022 ASEE Annual Conference & Exposition

Location

Minneapolis, MN

Publication Date

August 23, 2022

Start Date

June 26, 2022

End Date

June 29, 2022

Conference Session

ERM: Let's Talk about Tests! (Tests Part 1)

Page Count

14

DOI

10.18260/1-2--41102

Permanent URL

https://peer.asee.org/41102

Download Count

301

Paper Authors

Minju Kim, University of California, San Diego

Hello, I am Minju Kim, a PhD candidate in Experimental Psychology at UC San Diego. I am interested in promoting meaningful learning in engineering classes through research on course design (e.g., implementing oral exams as assessments and scaffolding students' connections with the teaching team). I am also interested in designing summative and formative assessments that scaffold students' analogical transfer of knowledge. I am looking for teaching postdoc positions in Psychology and Education for 2023-2025!

Celeste Pilegard, University of California, San Diego

Huihui Qi, University of California, San Diego

Dr. Qi is an Assistant Teaching Professor at the University of California, San Diego.

Curt Schurgers, University of California, San Diego

Teaching Professor at UC San Diego

Marko Lubarda, University of California, San Diego

Marko Lubarda is an Assistant Teaching Professor in the Department of Mechanical and Aerospace Engineering at the University of California, San Diego. He teaches mechanics, materials science, design, computational analysis, and engineering mathematics courses, and has co-authored the undergraduate textbook Intermediate Solid Mechanics (Cambridge University Press, 2020). He is dedicated to engineering pedagogy and enriching students' learning experiences through teaching innovations, curriculum design, and support of undergraduate student research.

Saharnaz Baghdadchi, University of California, San Diego

Saharnaz Baghdadchi is an Assistant Teaching Professor at UC San Diego. She is interested in scholarly teaching and uses active learning techniques to help students achieve an expert-like level of thinking. She guides students in bridging the gap between facts and usable knowledge to solve complex engineering problems.

Alex Phan, University of California, San Diego

Abstract

What is gained when midterm oral exams are implemented in the undergraduate engineering classroom? This research paper examines whether midterm oral exam scores add value above and beyond midterm written exam scores in predicting students' final written exam scores. The purpose of this study is to evaluate the potential utility of oral exams as formative assessments: if oral exam scores provide additional information beyond written exam scores, they may add meaningful value for students and instructors. The current study investigates this question using data from 10 undergraduate engineering classes (N = 925), representing 6 different courses and 5 different instructors. Though course and exam contexts differed, all classes implemented a low-stakes midterm oral exam, a midterm written exam, and a final written exam. We compared two multiple regression models: a reduced model with only the midterm written exam score as a predictor of the final written exam score, and a full model with both the midterm written exam score and the midterm oral exam score as predictors. We found that the full model including the oral exam score fit our data better, indicating that oral exam scores explain additional variance in students' final exam performance beyond midterm written exam scores alone. Further analyses tentatively indicate that the granularity of the rubric used to score oral exams matters, with finer-grained rubrics more consistently providing predictive value. This study has implications for developing a theory of oral exams, as it leaves room for the possibility that oral exams tap deeper learning processes than written exams. These results show that oral exams provide actionable information instructors can use to intervene and foster students' meaningful learning before the end of the term. The quantitative analysis also provides instructors with a simple statistical measure for assessing the role of oral exams in students' learning.
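As a rough illustration of the nested-model comparison described above, the sketch below fits a reduced and a full regression model and compares them with an F-test. This is not the authors' analysis code; the data file exam_scores.csv and the column names midterm_written, midterm_oral, and final_written are hypothetical placeholders, and the example assumes the pandas and statsmodels libraries.

```python
# Minimal sketch of the nested-model comparison described in the abstract.
# File name and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

# One row per student with the three exam scores.
df = pd.read_csv("exam_scores.csv")

# Reduced model: final written exam predicted from the midterm written exam only.
reduced = smf.ols("final_written ~ midterm_written", data=df).fit()

# Full model: add the midterm oral exam score as a second predictor.
full = smf.ols("final_written ~ midterm_written + midterm_oral", data=df).fit()

# Does the oral exam score explain additional variance? Compare the nested
# models with an F-test and inspect the change in R-squared.
f_stat, p_value, df_diff = full.compare_f_test(reduced)
print(f"R^2 reduced = {reduced.rsquared:.3f}, R^2 full = {full.rsquared:.3f}")
print(f"F({df_diff:.0f}, {full.df_resid:.0f}) = {f_stat:.2f}, p = {p_value:.4g}")
```

A significant F-statistic, together with a nontrivial increase in R-squared, would correspond to the paper's finding that the full model fits better than the reduced model.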

Keywords: oral exams, instructional design, assessment, multiple regression, quantitative analysis

Kim, M., Pilegard, C., Qi, H., Schurgers, C., Lubarda, M., Baghdadchi, S., & Phan, A. (2022, August). Midterm oral exams add value as a predictor of final written exam performance in engineering classes: A multiple regression analysis. Paper presented at the 2022 ASEE Annual Conference & Exposition, Minneapolis, MN. 10.18260/1-2--41102

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2022 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.