Montreal, Quebec, Canada
June 22, 2025
August 15, 2025
ERM Technical Session: Improving Assessment in Engineering Education
Educational Research and Methods Division (ERM)
https://peer.asee.org/56303
Chinedu Emeka recently earned a PhD in Computer Science from the University of Illinois at Urbana-Champaign. His research interests include engineering education and improving assessments for STEM students. He has taught multiple computer science courses at both the undergraduate and graduate levels and has received two teaching awards in recognition of his effectiveness as an instructor.
Matthew West is an Associate Professor in the Department of Mechanical Science and Engineering at the University of Illinois at Urbana-Champaign. Prior to joining Illinois, he was on the faculty of the Department of Aeronautics and Astronautics at Stanford University.
Jim Sosnowski is the Assistant Director of the CBTF at the University of Illinois Urbana-Champaign. He has conducted qualitative research focused on critically evaluating educational programming with the aim of developing more equitable classroom policies and practices to enhance the student learning experience.
Dr. Geoffrey L. Herman is the Severns Teaching Professor in the Siebel School of Computing and Data Science at the University of Illinois at Urbana-Champaign.
Craig Zilles is a Professor in the Computer Science department at the University of Illinois at Urbana-Champaign. His research focuses on the intersection of computing and education, particularly in assessment (e.g., the Computer-based Testing Facility, second-chance testing) and on how students learn to read code.
Mariana Silva is a Teaching Associate Professor in the Siebel School of Computing and Data Science at the University of Illinois Urbana-Champaign and co-founder and CEO of PrairieLearn Inc., a company dedicated to empowering instructors with tools to enhance teaching workflows without compromising educational quality. Before joining CS@Illinois in 2017, she was a lecturer in the Department of Mechanical Science and Engineering at the same university for five years. Silva has extensive experience in course development across engineering, computer science, and mathematics and is passionate about advancing teaching innovations that benefit students and instructors alike. She is an expert in the development and application of computer-based tools for teaching and learning in large STEM university courses. Her current research investigates the use of educational technologies to enhance computer-based assessments and centralized computer-based testing centers. This includes leveraging Large Language Models (LLMs) for automated short-answer grading and the creation of robust, randomized question generators to improve equity, accessibility, and scalability in teaching, learning and testing practices.
In this full, empirical research paper, we investigated whether the use of a computer-based testing center (CBTC) impacts students’ test anxiety. Increasing student enrollment and the desire to test computational skills are leading some large universities to adopt computer-based testing centers. In a CBTC, students are able to take their exams asynchronously (i.e., at different times of their choosing) using institutional computers secured by a firewall to prevent unauthorized Internet access. We compared a CBTC setup to a second potential method of administering exams at scale for engineering students. Under the second method, students complete their tests in class synchronously (i.e., at the same time) using their own computers, which are not secured by a firewall to limit unauthorized Internet access. This method of administering exams may be referred to as Bring Your Own Device (BYOD).
We ran a crossover experiment in a large engineering course, varying the testing modality used by students for each exam. Students took three of their six exams in the CBTC and the other three exams under the BYOD format. We administered a validated instrument on test anxiety after each exam and collected data on students’ exam scores. Overall, 149 students participated in the study and completed all the surveys. At the end of the semester, we also conducted interviews with a small number of students to learn more about perceptions of the two testing environments.
The raw results suggest that test anxiety was lower and performance was higher in the CBTC format than in the BYOD format, but the effect of exam modality may matter less than when, within the exam period, students took their BYOD exams. We explain this interaction in detail and present recommendations for computer-based examinations.
Emeka, C. A., West, M., Sosnowski, J., Herman, G. L., Zilles, C., & Silva, M. (2025, June). Do Centralized Testing Centers Influence Test Anxiety for Engineering Students? Paper presented at the 2025 ASEE Annual Conference & Exposition, Montreal, Quebec, Canada. https://peer.asee.org/56303
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2025 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015