Assessing the Impact of Open-Resource Access on Student Performance in Computer-Based Examinations

Conference

2024 ASEE Annual Conference & Exposition

Location

Portland, Oregon

Publication Date

June 23, 2024

Start Date

June 23, 2024

End Date

July 12, 2024

Conference Session

Computer-Supported Pedagogy and Assessment

Tagged Division

Computers in Education Division (COED)

Permanent URL

https://peer.asee.org/46619

Paper Authors

Zulal Sevkli Miami University

Dr. Zulal Sevkli has been serving as an Associate Teaching Professor in the Department of Computer Science and Software Engineering at Miami University since 2021. Her professional focus lies in evidence-based computer science education, as well as the application of bio-inspired metaheuristics and machine learning algorithms to develop decision support systems. She earned her Ph.D. in Computer Engineering from Gebze Institute of Technology in 2010 and has taught a wide range of courses across the computer science curriculum while supervising undergraduate and graduate research.

Abstract

This study explores the design, implementation, and evaluation of computer-based examination methods within the context of the System Programming course. The course is designed for students who have already completed two prerequisite programming courses. As the foundation for subsequent systems-oriented courses, its goal is to acquaint students with the underlying processes of computer systems.

Computer-based exams were introduced to the course in the Spring of 2022. Feedback from the course evaluation survey prompted two research questions: 1. Is resource restriction necessary during computer-based exams? 2. If students are allowed to access online resources during the exam, is there a link between their scores and how frequently they access those resources? To answer these questions, two computer-based exam types were introduced in Fall 2022, the following semester:

- First Type (Closed Digital Resources): The exam has two parts: multiple-choice/true-false questions and short answers/C++ solutions. Students could only move forward through the first part but could navigate freely within the second. Only handwritten notes were allowed.

- Second Type (Open Digital Resources): The exam has the same structure as the closed-resource format. In addition, students could access specific online resources during the exam, including the online textbook, online lecture notes, lab codes, and two C++ reference sites. To ensure compliance with the allowed resources, proctoring software embedded in the test platform records students' network traffic and screens and generates post-exam violation reports.

In the Fall of 2022, seven sections of the System Programming course administered computer-based exams. Four sections gave four closed digital resource exams (with handwritten notes permitted), while the remaining three sections gave four open digital resource exams.

Both types of computer-based exams were conducted in the classroom on students' own devices. Students were responsible for installing and keeping up to date the necessary software (the Chrome browser, the proctoring browser plug-in, and authentication for the test platform and wireless network) and for bringing hardware utilities such as a charging adapter.

To analyze the exam results, we computed the average, standard deviation, and median for the two exam types. In the open-resource exams, the median score of the three sections was higher than the average, indicating that most students scored above the mean. In contrast, the closed-resource exams showed nearly identical average and median values across the four sections, and both were lower than those of the open-resource exams, suggesting better performance in the open-resource setting.
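As an illustration, the per-section summary statistics described above can be computed with a short Python sketch like the one below; the score values shown are hypothetical, not data from the study.

    import statistics

    # Hypothetical scores (0-50 scale) for one exam section; not data from the study.
    scores = [42, 38, 45, 30, 47, 41, 36, 44]

    mean = statistics.mean(scores)      # average
    stdev = statistics.stdev(scores)    # sample standard deviation
    median = statistics.median(scores)

    print(f"mean={mean:.2f}  stdev={stdev:.2f}  median={median:.2f}")

    # A median above the mean suggests that most students scored above the
    # average, the pattern reported for the open-resource sections.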

Regarding the correlation between exam scores (0-50) and the frequency of open-resource access, the first exam showed a moderate negative correlation (-0.38): lower scores correlated with more frequent resource access. The subsequent exams (Exams 2, 3, and 4) showed weak negative correlations (-0.12, -0.03, and -0.18). The variation might stem from the changing topics across the four exams, with students accessing resources more often as the subject matter became less familiar.
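A minimal sketch of this correlation analysis in Python is shown below; the paper does not state which tools were used, and the paired values here are hypothetical.

    import statistics

    # Hypothetical per-student pairs: exam score (0-50) and number of
    # allowed-resource accesses during the exam; not data from the study.
    scores   = [45, 30, 38, 25, 48, 35, 42, 28]
    accesses = [3, 12, 7, 15, 2, 9, 5, 14]

    # Pearson correlation coefficient (statistics.correlation needs Python 3.10+).
    r = statistics.correlation(scores, accesses)
    print(f"r = {r:.2f}")  # negative r: more frequent access pairs with lower scores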

The full paper is structured as follows: Section 1 introduces the purpose of the study by summarizing related work on computer-based exams. Section 2 outlines the System Programming course content and its assessments. Section 3 explains the exam design methods and lists the research questions. The study findings are detailed in Section 4, and Section 5 concludes the paper.

By presenting detailed analyses, statistical comparisons, and insights from student feedback, we aim to contribute to the broader discussion on adapting computer-based examination methods in computer science education.

Sevkli, Z. (2024, June), Assessing the Impact of Open-Resource Access on Student Performance in Computer-Based Examinations Paper presented at 2024 ASEE Annual Conference & Exposition, Portland, Oregon. https://peer.asee.org/46619
