
Comparing Student Performance on Computer-Based vs. Paper-Based Tests in a First-Year Engineering Course



2014 ASEE Annual Conference & Exposition


Indianapolis, Indiana

Publication Date

June 15, 2014

Start Date

June 15, 2014

End Date

June 18, 2014



Conference Session

Computing in the First Year

Tagged Division

Computers in Education

Page Numbers

24.297.1 - 24.297.14




Paper Authors


Meagan Eleanor Ita Ohio State University


Meagan Ita is a master's student in the biomedical engineering graduate program at Ohio State University, graduating in May 2014. She received her B.S. in biomedical engineering in the spring of 2013 from Ohio State, where she works as a graduate research associate in the Injury Biomechanics Research Center and as a graduate teaching associate with the Fundamentals of Engineering with Honors program in the Engineering Education Innovation Center (EEIC). This is her fourth year as a teaching assistant in the EEIC, and she is interested in investigating first-year experiences in engineering and optimizing the learning experience for these students. Next year, she will continue on to her Ph.D. in biomedical engineering.



Krista M. Kecskemety Ohio State University


Krista Kecskemety is a lecturer in the Engineering Education Innovation Center at Ohio State University. Krista received her B.S. in aerospace engineering in 2006 and her M.S. in 2007, both from Ohio State. In 2012, Krista completed her Ph.D. in aerospace engineering at Ohio State. Her engineering education research interests include investigating first-year engineering student experiences, faculty experiences, and the connection between the two.



Katlyn Elizabeth Ashley Ohio State University


Katlyn Ashley is a student currently pursuing a B.S. in chemical engineering at Ohio State University. Katlyn is also an undergraduate teaching assistant in the Engineering Education Innovation Center at Ohio State, which prompted her research interests in first-year engineering education.



Brooke Morin Ohio State University



Computer-based examinations are increasingly common at universities, as well as in other areas such as government-related examinations and standardized tests. Computer-based examinations allow for automated grading, thereby decreasing the workload of a university's instructional staff. The increase in computer examinations also corresponds with an increase in online textbooks, course content databases, and untimed online assessments such as homework. Studies comparing student performance on paper-based versus computer-based tests, however, show conflicting results: some report that students perform better on paper, some report superior performance on the computer, and still others report no difference between test modes. The rapid advance of technology and its incorporation into students' lives at earlier ages certainly plays a role in how students may approach a paper-based versus a computer-based test. Therefore, a first-year engineering program at a large Midwestern university conducted a study to examine the test-mode effect for two midterm exams and one final exam. This study seeks to address the following research question: Is there a difference in student performance between computer-based and paper-based exams, and, if so, what factors contribute to any differences?

Approximately 360 students participated in this study. The portion of each exam from which grades were collected was split into two parts, Part 1 and Part 2, which together comprised 40-50% of each total exam grade. The question types for this portion of the exam included multiple-choice, multiple-select, true/false, and fill-in-the-blank. In each class, half of the students completed Part 1 on paper and Part 2 on the computer, while the other half did the reverse. Question phrasing and order within the same part (1 or 2) were identical, regardless of testing mode. The paper portions were created to be aesthetically similar to the computer version. The computer-based questions were completed on a course management system. Students were familiar with completing untimed quizzes in this environment prior to the exams; however, before the first midterm examination they had never experienced the quiz environment under a time limit. On the computer portion of the exam, students were able to navigate freely between questions, preserving the same "flip-back" opportunity available with paper-based questions.

As of abstract submission, preliminary data from the first midterm suggest there is not a statistically significant difference between overall test scores on the computer and paper exams. There are certain questions, however, on which students scored higher on the paper exam. Further analyses will be conducted after the second midterm and final exam to investigate whether certain question types lend themselves to better performance in one mode over the other. Additionally, questions that require students to refer to an image will be analyzed for any differences between the test modes.

The results of this study will help determine the impact that the increased prevalence of computer-based examinations might have on student performance in a first-year engineering course. It is our hope that educators can consider this information when creating exams in order to maximize student success.

Ita, M. E., Kecskemety, K. M., Ashley, K. E., & Morin, B. (2014, June). Comparing Student Performance on Computer-Based vs. Paper-Based Tests in a First-Year Engineering Course. Paper presented at the 2014 ASEE Annual Conference & Exposition, Indianapolis, Indiana. DOI: 10.18260/1-2--20188

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2014 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015