
Computerized Testing: A Vision and Initial Experiences


Conference

2015 ASEE Annual Conference & Exposition

Location

Seattle, Washington

Publication Date

June 14, 2015

Start Date

June 14, 2015

End Date

June 17, 2015

ISBN

978-0-692-50180-1

ISSN

2153-5965

Conference Session

Computer-Based Tests, Problems, and Other Instructional Materials

Tagged Division

Computers in Education

Page Count

13

Page Numbers

26.387.1 - 26.387.13

DOI

10.18260/p.23726

Permanent URL

https://peer.asee.org/23726


Paper Authors


Craig Zilles University of Illinois, Urbana-Champaign


Craig Zilles is an Associate Professor in the Computer Science department at the University of Illinois at Urbana-Champaign. His current research focuses on computer science education and computer architecture. His research has been recognized by two best paper awards from ASPLOS (2010 and 2013) and by selection for inclusion in the IEEE Micro Top Picks from the 2007 Computer Architecture Conferences. He received the IEEE Education Society's Mac Van Valkenburg Early Career Teaching Award in 2010, a (campus-wide) Illinois Student Senate Teaching Excellence award in 2013, the NSF CAREER award, and the University of Illinois College of Engineering's Rose Award and Everitt Award for Teaching Excellence. Prior to his work on education and computer architecture, he developed the first algorithm that allowed rendering arbitrary three-dimensional polygonal shapes for haptic interfaces (force-feedback human-computer interfaces). He holds 6 patents.


Robert Timothy Deloatch University of Illinois, Urbana-Champaign


Jacob Bailey University of Illinois


Jacob Bailey is currently a sophomore studying computer science at the University of Illinois.


Bhuwan B. Khattar


Wade Fagen University of Illinois, Urbana-Champaign


Dr. Wade Fagen is a Lecturer in the Department of Computer Science in the College of Engineering at The University of Illinois at Urbana-Champaign (UIUC). He teaches one of UIUC's largest courses, Introduction to Computer Science, known as CS 105. His research aims to improve learning by using technologies that students already bring to the classroom.


Cinda Heeren University of Illinois, Urbana-Champaign


Dr. Cinda Heeren is an award-winning Senior Lecturer at the University of Illinois, Urbana-Champaign. She teaches CS225, Data Structures and Programming Principles, to hundreds of enthusiastic and talented undergraduates every year. She is always game to try new pedagogical innovations, and she loves telling young women about her affection for computing.


David Mussulman Engineering IT Shared Services, University of Illinois, Urbana-Champaign


Matthew West University of Illinois, Urbana-Champaign


Matthew West is an Associate Professor in the Department of Mechanical Science and Engineering at the University of Illinois at Urbana-Champaign. Prior to joining Illinois he was on the faculties of the Department of Aeronautics and Astronautics at Stanford University and the Department of Mathematics at the University of California, Davis. Prof. West holds a Ph.D. in Control and Dynamical Systems from the California Institute of Technology and a B.Sc. in Pure and Applied Mathematics from the University of Western Australia. His research is in the field of scientific computing and numerical analysis, where he works on computational algorithms for simulating complex stochastic systems such as atmospheric aerosols and feedback control. Prof. West is the recipient of the NSF CAREER award and is a University of Illinois Distinguished Teacher-Scholar and College of Engineering Education Innovation Fellow.


Abstract

In a large (200+ student) class, running exams is a logistical nightmare. Such exams require conflict exams and figuring out how to address the full range of Bloom's taxonomy learning goals in a manner that can be efficiently graded to give students quick feedback. Typically, these exam hassles lead instructors to give a few large, multiple-choice-heavy exams, which can be suboptimal for student learning.

In this paper, we pursue a different vision, enabled by making a computer a central part of the testing process. We envision a computerized testing center, proctored 60-80 hours/week. When a course assigns a (computerized) exam, the professor specifies a range of days for the exam and each student reserves a time at their convenience. When the student arrives, they are ushered to a machine that has been booted into the specified exam configuration (many different exams run in the testing center concurrently). The student logs in and is guided through their exam. Each exam consists of a random selection of parameterized problems meeting coverage and difficulty criteria, so each exam is different. The networking on the machine is configured to prevent unauthorized communication, and the system displays and controls the time remaining for the exam.

We see two main advantages to this approach. First, we centralize all of the hassles of running exams, so course staff no longer have to manage the scheduling, staffing, and paper shuffling. This drastically lowers the effort of running exams, making more frequent, lower-stakes testing and second-chance testing practical. Second, we greatly broaden the kinds of questions that can be machine graded. Most large classes rely at least partially on Scantrons for automation, but many of the questions that we want to ask aren't multiple choice.

With a computer involved, we can ask (and auto-grade) any question that can be objectively scored: we can ask students to design circuits, do graphical problems like drawing acceleration vectors, write code, write equations, draw force diagrams, align genetic sequences, and so on. Furthermore, because modern engineering is practiced in a heavily computer-supported environment, we can have students use industry-standard software to solve design and analysis problems. This is particularly compelling in programming classes, where students can compile, test, and debug their solutions before submitting them for grading.

In this paper, we describe our experiences with a prototype computerized testing lab and with running all of a 200-student computer organization class's exams using computerized testing. We discuss the mechanics of operating the testing lab, the work required of the instructor to enable this approach (e.g., generating a diversity of equivalent-difficulty problems), and the student response, which has been strongly positive: 75% prefer computerized testing, 12% prefer traditional written exams, and 13% had no preference.
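To make the idea of "parameterized problems" with objective auto-grading concrete, here is a minimal sketch in Python. The question, parameter ranges, function names, and grading tolerance are all hypothetical illustrations of the general technique, not the actual question bank or grading system described in the paper.

```python
import random

def generate_ohms_law_problem(seed):
    """Generate one parameterized instance of a simple circuit question.

    Each seed yields a different but equivalent-difficulty instance,
    so every student's exam is different while testing the same skill.
    (The parameter ranges and question text here are invented examples.)
    """
    rng = random.Random(seed)
    voltage = rng.choice([3, 5, 9, 12])       # volts
    resistance = rng.choice([100, 220, 470])  # ohms
    prompt = (f"A {voltage} V source drives a {resistance} ohm resistor. "
              f"What current flows, in amperes?")
    answer = voltage / resistance             # exact answer, by Ohm's law
    return prompt, answer

def grade(submitted, correct, rel_tol=0.01):
    """Objective scoring: full credit if within 1% of the exact answer."""
    return abs(submitted - correct) <= rel_tol * abs(correct)

# A per-student seed selects that student's instance of the problem.
prompt, answer = generate_ohms_law_problem(seed=42)
```

Because the answer is computed alongside the prompt, grading is immediate and objective; richer question types (drawn diagrams, submitted code) follow the same pattern with a more elaborate checker in place of the numeric tolerance test.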

Zilles, C., & Deloatch, R. T., & Bailey, J., & Khattar, B. B., & Fagen, W., & Heeren, C., & Mussulman, D., & West, M. (2015, June), Computerized Testing: A Vision and Initial Experiences Paper presented at 2015 ASEE Annual Conference & Exposition, Seattle, Washington. 10.18260/p.23726

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015