Virtual (Online)
June 22, 2020
June 26, 2021
Concept inventories (CIs)—validated tests based on carefully articulated models of conceptual knowledge in a field—have been developed for many introductory STEM courses, such as physics/mechanics, statics, chemistry, and electricity and magnetism. CIs can be powerful research tools for measuring students’ progression toward expert-level thinking, but they can be difficult to develop for intermediate courses in which domain-specific knowledge, problem-solving strategies, and technical fluency are important learning goals alongside conceptual frameworks. For such intermediate courses, it is still valuable to develop high-quality, multiple-choice tests to measure students’ progress toward course learning objectives or to assess the efficacy of instructional interventions.
We describe the development process and early results for multiple-choice learning assessments (MCLAs), drawn from a mix of pre-existing instruments and original content, for four 300-level mechanical engineering courses taken in the junior year: Fluid Mechanics, Mechanics of Materials, System Dynamics, and Mechatronics. We conducted a series of 16 “think-aloud” interviews with undergraduate students who had a range of prior experience with each subject. Think-aloud interviews provide rich qualitative data about student thought processes that are not available from multiple-choice or even free-response answers. Students use a variety of problem-solving strategies, including application of memorized formulas, identification with personal experience, mental simulation, strategic elimination, and applying reverse psychology to the presumed test author. We highlight some challenges in developing MCLAs, including naively designed questions that admit correct answers reached through incorrect reasoning, and schematic diagrams that may unintentionally cue irrelevant concepts.
Three of the assessments were delivered as low-stakes quizzes in large-enrollment, lecture-based courses at a private R1 university. Results from this larger sample show that many of the misconceptions identified during the interviews remain widely held even after instruction. Assessment scores are moderately correlated with final exam scores (0.30 < r < 0.62) and with course grades (0.33 < r < 0.49). Data from the pilot tests suggest possible further enhancements to the assessments.
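As an illustration of how correlations of this kind can be computed, the minimal sketch below calculates a Pearson correlation coefficient between per-student assessment scores and final exam scores. It is not the authors' analysis code; the file name and column names are hypothetical placeholders.

# Minimal sketch (assumed data layout, not the authors' pipeline):
# one row per student, with an MCLA score and a final exam score.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("fluids_scores.csv")  # hypothetical columns: "mcla_score", "final_exam"
df = df.dropna(subset=["mcla_score", "final_exam"])  # keep students with both scores

# Pearson r quantifies the linear association between the two score distributions.
r, p = pearsonr(df["mcla_score"], df["final_exam"])
print(f"Pearson r = {r:.2f} (p = {p:.3g}, n = {len(df)})")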
Ford, M. J., & Ritz, H., & Finio, B., & Fisher, E. M. (2020, June), Multiple-choice Learning Assessments for Intermediate Mechanical Engineering Courses: Insights from Think-aloud Interviews Paper presented at 2020 ASEE Virtual Annual Conference Content Access, Virtual On line . 10.18260/1-2--34990