June 14, 2015
June 17, 2015
Computers in Education
26.1302.1 - 26.1302.15
Randomized Exams for Large STEM Courses Spread via Communities of Practice

We present a software system to generate, grade, and analyze individualized per-student-randomized exams. The objectives of this system are to: (1) scale exams efficiently to very large class sizes (approaching 1000 students), and (2) improve the integrity of the exam process. To achieve these objectives, we implemented software in Python and LaTeX that generates unique per-student multiple-choice exam PDFs identified by unique error-correcting codes. These computer-generated exams are randomized from a tagged LaTeX source document that contains multiple variants of each question. The randomization includes randomized question variants, random question order (with constraints), and random answer order. Students take the exams on paper, coding their answers on Scantron sheets, which are then scanned for import back into the de-randomizing Python grading system. The grading software produces both student scores and individualized student feedback, as well as summary statistics and analyses of the exam and its questions. The results of implementing this new computer-based randomized exam system include a dramatic reduction in student complaints about grading, an order-of-magnitude reduction in time-to-feedback, and an improved instructor experience. We present detailed results including: (1) a comparison of multiple-choice exams to the free-response format previously used, focusing on question discrimination and predictive value, (2) students' perceptions of exam fairness and the exam-taking experience, and (3) faculty perceptions and experiences. In addition to the randomized exam technology itself, we also analyze the spread of this technology from its source in Calculus 2 (Eng) during Fall 2012 to nine other large STEM courses in four departments by Fall 2014.
We identify two key factors in this spread: (1) the use of Communities of Practice (CoPs) as "concentrators", and (2) the embedding of faculty in cross-department teaching roles.

Figure 1: Spread of randomized exam technology from the source in Math 231E (top left) to nine other courses, via three departmental Communities of Practice (CoPs).
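The core de-randomization idea in the abstract, deriving a student's exam deterministically from their identity so the same process can both generate and grade it, can be sketched as follows. This is a minimal illustration only: the question format, the SHA-256 seeding scheme, and the `randomize_exam` function are assumptions for this sketch, not the paper's actual implementation.

```python
import hashlib
import random

def randomize_exam(student_id, questions):
    """Build one student's exam: pick a variant of each question,
    shuffle the answer order within each question, and shuffle the
    question order. `questions` is a list of question groups, each a
    list of variants; a variant is a dict with 'prompt' and 'answers',
    where the correct answer is listed first by convention."""
    # Derive a stable per-student seed, so regenerating the exam
    # for grading reproduces the same randomization.
    seed = int(hashlib.sha256(student_id.encode()).hexdigest(), 16)
    rng = random.Random(seed)

    exam = []
    for variants in questions:
        q = dict(rng.choice(variants))           # random question variant
        answers = list(enumerate(q['answers']))  # tag original positions
        rng.shuffle(answers)                     # random answer order
        q['answers'] = [text for _, text in answers]
        # Record where the originally-first (correct) answer landed.
        q['correct'] = next(i for i, (orig, _) in enumerate(answers) if orig == 0)
        exam.append(q)
    rng.shuffle(exam)                            # random question order
    return exam
```

Because the seed is a pure function of the student ID, calling `randomize_exam` again with the same ID reproduces the identical exam, which is what lets a grader map a student's Scantron answers back to the canonical answer key.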
West, M., Silva, M., & Herman, G. L. (2015, June). Randomized Exams for Large STEM Courses Spread via Communities of Practice. Paper presented at the 2015 ASEE Annual Conference & Exposition, Seattle, Washington. DOI: 10.18260/p.24639
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.