
Randomized Exams for Large STEM Courses Spread via Communities of Practice



2015 ASEE Annual Conference & Exposition


Seattle, Washington

Publication Date: June 14, 2015

Start Date: June 14, 2015

End Date: June 17, 2015





Conference Session: Best of Computers in Education

Tagged Division: Computers in Education


Page Numbers: 26.1302.1 - 26.1302.15




Paper Authors


Matthew West, University of Illinois at Urbana-Champaign


Matthew West is an Associate Professor in the Department of Mechanical Science and Engineering at the University of Illinois at Urbana-Champaign. Prior to joining Illinois he was on the faculties of the Department of Aeronautics and Astronautics at Stanford University and the Department of Mathematics at the University of California, Davis. Prof. West holds a Ph.D. in Control and Dynamical Systems from the California Institute of Technology and a B.Sc. in Pure and Applied Mathematics from the University of Western Australia. His research is in the field of scientific computing and numerical analysis, where he works on computational algorithms for simulating complex stochastic systems such as atmospheric aerosols and feedback control. Prof. West is the recipient of the NSF CAREER award and is a University of Illinois Distinguished Teacher-Scholar and College of Engineering Education Innovation Fellow.


Mariana Silva, University of Illinois at Urbana-Champaign


Geoffrey L. Herman, University of Illinois at Urbana-Champaign


Dr. Geoffrey L. Herman is a visiting assistant professor with the Illinois Foundry for Innovation in Engineering Education at the University of Illinois at Urbana-Champaign and a research assistant professor with the Department of Curriculum & Instruction. He earned his Ph.D. in Electrical and Computer Engineering from the University of Illinois at Urbana-Champaign as a Mavis Future Faculty Fellow and conducted postdoctoral research with Ruth Streveler in the School of Engineering Education at Purdue University. His research interests include creating systems for sustainable improvement in engineering education, promoting intrinsic motivation in the classroom, conceptual change and development in engineering students, and change in faculty beliefs about teaching and learning. He serves as the webmaster for the ASEE Educational Research and Methods Division.




We present a software system to generate, grade, and analyze individualized, per-student-randomized exams. The objectives of this system are to: (1) scale exams efficiently to very large class sizes (approaching 1000 students), and (2) improve the integrity of the exam process. To achieve these objectives, we implemented software in Python and LaTeX that generates unique per-student multiple-choice exam PDFs identified by unique error-correcting codes. These computer-generated exams are randomized from a tagged LaTeX source document that contains multiple variants of each question. The randomization includes randomized question variants, random question order (with constraints), and random answer order. The students take the exams on paper, coding their answers on paper Scantron sheets, which are then scanned for import back into the de-randomizing Python grading system. The grading software produces both student scores and individualized student feedback, as well as summary statistics and analyses of the exam and questions.

The results of implementing this new computer-based randomized exam system include a dramatic reduction in student complaints about grading, an order-of-magnitude reduction in time-to-feedback, and an improved instructor experience. We present detailed results including: (1) a comparison of the multiple-choice exams to the free-response format previously used, focusing on question discrimination and predictive value; (2) students' perceptions of exam fairness and of the exam-taking experience; and (3) faculty perceptions and experiences.
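The core randomization scheme described above can be illustrated with a minimal Python sketch. This is not the authors' actual implementation (which works from a tagged LaTeX source and emits PDFs with error-correcting codes); it is a hedged illustration of the three randomization steps the abstract names (random question variants, random question order, random answer order) and of seed-based "de-randomization" for grading. The question pool, function names, and student IDs below are hypothetical.

```python
# Sketch of per-student exam randomization and de-randomized grading.
# A per-student seed makes each exam reproducible, so the grading step can
# regenerate the student's answer key instead of storing every exam.
import random

# Hypothetical question pool: each question is a list of variants; each
# variant records its prompt, option texts, and the index of the correct option.
QUESTION_POOL = [
    [  # question 1, two variants
        {"prompt": "2 + 2 = ?", "options": ["3", "4", "5", "6"], "correct": 1},
        {"prompt": "3 + 1 = ?", "options": ["2", "3", "4", "5"], "correct": 2},
    ],
    [  # question 2, one variant
        {"prompt": "d/dx of x^2 = ?", "options": ["x", "2x", "x^2", "2"], "correct": 1},
    ],
]

def generate_exam(student_id, pool=QUESTION_POOL):
    """Return (exam, key): a randomized exam and its per-student answer key."""
    rng = random.Random(student_id)        # reproducible per-student seed
    order = list(range(len(pool)))
    rng.shuffle(order)                     # random question order
    exam, key = [], []
    for qi in order:
        variant = rng.choice(pool[qi])     # random question variant
        opts = list(enumerate(variant["options"]))
        rng.shuffle(opts)                  # random answer order
        correct_pos = next(i for i, (orig, _) in enumerate(opts)
                           if orig == variant["correct"])
        exam.append({"prompt": variant["prompt"],
                     "options": [text for _, text in opts]})
        key.append(correct_pos)
    return exam, key

def grade(student_id, answers):
    """De-randomize and grade: regenerate this student's key from the seed."""
    _, key = generate_exam(student_id)
    return sum(a == k for a, k in zip(answers, key))
```

A production system would add the constraint handling mentioned in the abstract (e.g., keeping related questions adjacent) and map the seed to a printed error-correcting code so scanned Scantron sheets can be matched back to the right key.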
In addition to the randomized exam technology itself, we also analyze the spread of this technology from its source in Calculus 2 (Eng) during Fall 2012 to nine other large STEM courses in four departments by Fall 2014. We identify two key factors in this spread: (1) the use of Communities of Practice (CoPs) as "concentrators", and (2) the embedding of faculty in cross-department teaching roles.

[Figure 1: Spread of randomized exam technology from the source in Math 231E (top left) to nine other courses (Calculus 2 non-Eng, CS1, CS1 non-major, Computer Architecture, Dynamics, Statics, Solids, MatSE Mechanics, and MatSE Thermal & Mechanical), via three departmental Communities of Practice (CS, TAM, and MatSE CoPs).]

West, M., Silva, M., & Herman, G. L. (2015, June). Randomized Exams for Large STEM Courses Spread via Communities of Practice. Paper presented at 2015 ASEE Annual Conference & Exposition, Seattle, Washington. 10.18260/p.24639

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015