
Multiple-choice Learning Assessments for Intermediate Mechanical Engineering Courses: Insights from Think-aloud Interviews


Conference

2020 ASEE Virtual Annual Conference Content Access

Location

Virtual Online

Publication Date

June 22, 2020

Start Date

June 22, 2020

End Date

June 26, 2020

Conference Session

Mechanical Engineering Technical Session: Assessment and Accreditation: Making the Grade!

Tagged Division

Mechanical Engineering

Page Count

14

DOI

10.18260/1-2--34990

Permanent URL

https://peer.asee.org/34990


Paper Authors

Matthew J. Ford, Cornell University (orcid.org/0000-0002-1053-7149)

Matthew Ford is currently a Postdoctoral Teaching Specialist working with the Cornell Active Learning Initiative. His background is in solid mechanics.

Hadas Ritz, Cornell University (orcid.org/0000-0002-5396-2962)

Hadas Ritz is a senior lecturer in Mechanical and Aerospace Engineering, and a Faculty Teaching Fellow at the James McCormick Family Teaching Excellence Institute (MTEI) at Cornell University, where she received her PhD in Mechanical Engineering in 2008. Since then she has taught required and elective courses covering a wide range of topics in the undergraduate Mechanical Engineering curriculum. In her work with MTEI she co-leads teaching workshops for new faculty and assists with other teaching excellence initiatives. Her main teaching interests include solid mechanics and engineering mathematics.

Benjamin Finio, Cornell University

Elizabeth M. Fisher, Cornell University

Elizabeth M. Fisher is an Associate Professor in the Sibley School of Mechanical and Aerospace Engineering at Cornell. She received her PhD from U.C. Berkeley.


Abstract

Concept inventories (CIs)—validated tests based on carefully articulated models of conceptual knowledge in a field—have been developed for many introductory STEM courses such as Physics / Mechanics, Statics, Chemistry, and Electricity and Magnetism. CIs can be powerful research tools for measuring students’ progression towards expert-level thinking, but can be difficult to develop for intermediate courses where domain-specific knowledge, problem-solving strategies, and technical fluency are important learning goals alongside conceptual frameworks. For such intermediate courses, it is still valuable to develop high-quality, multiple-choice tests to measure students’ progress towards course learning objectives or to assess the efficacy of instructional interventions.

We describe the development process and early results for multiple-choice learning assessments (MCLAs), drawn from a mix of pre-existing instruments and original content, for four 300-level mechanical engineering courses taken in the junior year: Fluid Mechanics, Mechanics of Materials, System Dynamics, and Mechatronics. We conducted a series of 16 “think-aloud” interviews with undergraduate students with a range of prior experience in each subject. Think-aloud interviews provide rich, qualitative data about student thought processes that are not available from multiple-choice or even free-response data. Students use a variety of problem-solving strategies, including application of memorized formulas, identification with personal experience, mental simulation, strategic elimination, and second-guessing the presumed intent of the test author. We highlight some challenges in developing MCLAs, including naively designed questions that admit correct answers reached through incorrect reasoning, and schematic diagrams that may unintentionally cue irrelevant concepts.

Three of the assessments were delivered as low-stakes quizzes in large-enrollment, lecture-based courses at a private R1 university. Results from the large sample show that many of the misconceptions identified during the interviews remain widespread even after instruction. Assessment scores are moderately correlated with final exam scores (0.30 < r < 0.62) and course grades (0.33 < r < 0.49). Data from the pilot tests suggest possible further enhancements to the assessments.

Ford, M. J., & Ritz, H., & Finio, B., & Fisher, E. M. (2020, June), Multiple-choice Learning Assessments for Intermediate Mechanical Engineering Courses: Insights from Think-aloud Interviews. Paper presented at 2020 ASEE Virtual Annual Conference Content Access, Virtual Online. 10.18260/1-2--34990

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2020 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.