Location: Minneapolis, MN
Publication Date: August 23, 2022
Conference Dates: June 26-29, 2022
Page Count: 29
DOI: 10.18260/1-2--41301
Permanent URL: https://peer.asee.org/41301
Download Count: 500
I am a lecturer at the University of Michigan. I research ways to use data-informed analysis of students' performance and perceptions of the classroom environment to support DEI-based curricular improvements.
Harsh Jhaveri is a master's student at the University of Michigan, pursuing a degree in Robotics. He previously completed dual bachelor's degrees in Aerospace Engineering and Computer Science Engineering, also at the University of Michigan. In addition to pursuing his degree, Harsh is a graduate student instructor (GSI) for Engineering 101, Introduction to Computers and Programming, a first-semester course required of all engineering students. Beyond his teaching duties, Harsh has helped facilitate course logistics, course development, and professional development for staff members through the Foundational Course Initiative at the University of Michigan. Outside of teaching, Harsh enjoys developing software for autonomous aircraft systems, cooking, and collecting vinyl LPs.
This research paper describes our analysis of how well student exam scores in a large introductory programming course evaluate student learning in the context of the course's other assessment mechanisms. Data from Academic Years 2018-2019 and 2020-2021 were used to compare the pre-pandemic individual assessment scheme with the revised scheme implemented in response to the shift to remote instruction. Our analysis focused on two key questions: 1) How well do individual assessments enable students to demonstrate their learning? and 2) How equitable are the assessments with respect to grade outcomes for students with historically marginalized identities? Specifically, our aim was to reduce the onerous workload of exam preparation for the instructional team while still assessing student knowledge in a pedagogically effective and equitable way.
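To make the second question concrete, the sketch below shows one way a grade-outcome gap between student groups could be quantified under each assessment scheme, using a standardized mean difference (Cohen's d). This is a hypothetical illustration rather than the paper's actual methodology; the column names (scheme, group, score), the two-group coding, and the synthetic numbers are all assumptions made only for the example.

import numpy as np
import pandas as pd

def cohens_d(a, b):
    # Standardized mean difference (pooled SD) between two score samples.
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2.0)
    return (a.mean() - b.mean()) / pooled_sd

def equity_gap(df, scheme):
    # Score gap between the two (hypothetical) groups within one assessment scheme.
    sub = df[df["scheme"] == scheme]
    majority = sub.loc[sub["group"] == "majority", "score"]
    marginalized = sub.loc[sub["group"] == "marginalized", "score"]
    return cohens_d(majority, marginalized)

# Synthetic stand-in for de-identified course records (not real data).
rng = np.random.default_rng(0)
records = pd.DataFrame({
    "scheme": ["pre-pandemic exams"] * 200 + ["revised assessments"] * 200,
    "group": (["majority"] * 150 + ["marginalized"] * 50) * 2,
    "score": np.concatenate([
        rng.normal(78, 12, 150), rng.normal(71, 13, 50),
        rng.normal(82, 10, 150), rng.normal(80, 10, 50),
    ]),
})

for scheme in ["pre-pandemic exams", "revised assessments"]:
    print(f"{scheme}: score gap (Cohen's d) = {equity_gap(records, scheme):.2f}")

A smaller gap under the revised scheme would be consistent with the "more equitable" finding summarized below; the paper itself describes the metrics actually used.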
Based on the results of this development and analysis, we implemented a four-assessment structure for Fall 2021. Preliminary analysis of these assessments indicates that the new assessments are more equitable and enable earlier identification of students who may be struggling with the course material. Additionally, the new assessment infrastructure requires significantly less instructor time to maintain and implement from term to term.
In this paper, we will describe our motivation for this study, our analysis of past exam and course data, the reasoning behind the new assessments, and a preliminary analysis of the effectiveness and equity of the new assessments. We hope that others find our experiences and analysis useful in informing their own assessment decision-making.
Alford, L., & Rypkema, H., & Hosseini, R., & Beemer, M., & Jhaveri, H. (2022, August), Turns Out Our Exams Were Pointless, So We Changed Our Assessment Strategy Paper presented at 2022 ASEE Annual Conference & Exposition, Minneapolis, MN. 10.18260/1-2--41301
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2022 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.