
Open Book Vs. Closed Book Testing: An Experimental Comparison


Conference

2010 Annual Conference & Exposition

Location

Louisville, Kentucky

Publication Date

June 20, 2010

Start Date

June 20, 2010

End Date

June 23, 2010

ISSN

2153-5965

Conference Session

ERM Potpourri

Tagged Division

Educational Research and Methods

Page Count

11

Page Numbers

15.929.1 - 15.929.11

DOI

10.18260/1-2--16901

Permanent URL

https://peer.asee.org/16901

Download Count

11344


Paper Authors

Biography

Leticia Anaya, University of North Texas


Leticia Anaya, M.S., is a Lecturer in the Department of Engineering Technology at the University of North Texas College of Engineering. She is currently working toward her PhD in Management Science at the University of North Texas. She received her M.S. in Industrial Engineering from Texas A&M University. Her research and teaching interests include Thermal Sciences, Statistics, Quality Assurance, Machine Design, Simulation, and Educational Teaching Methods. She has published previously in ASEE conference proceedings and has developed three laboratory manuals in the following areas: Thermal Sciences, Fluid Mechanics, and Mechanics of Materials.


Biography

Nicholas Evangelopoulos, University of North Texas


Nicholas Evangelopoulos, PhD, is an Associate Professor of Decision Sciences at the University of North Texas and a Fellow of the Texas Center for Digital Knowledge. He received his Ph.D. in Decision Sciences from Washington State University and his M.S. in Computer Science from the University of Kansas. His research interests include Statistics and Text Mining. His publications include articles appearing in MIS Quarterly, Communications in Statistics, ASCE Journal of Hydraulic Engineering, and Computational Statistics & Data Analysis.


Biography

Uyi Lawani, University of North Texas


Uyi Lawani, M.S., is a doctoral student in strategy in the Department of Management and a Fellow of the Robert B. Toulouse School of Graduate Studies at the University of North Texas. While his doctoral minor work was in Economics, he holds a B.S. in Microbiology and received his MBA in Finance from East Carolina University. His research interests include organizational governance structures: mergers, acquisitions, and alliances. His solo-authored refereed paper has been published in the proceedings of the Decision Sciences Institute.



Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Open-Book vs. Closed-Book Testing: An Experimental Comparison

Abstract

This research adds to the ongoing debate over which is the better method of assessing college students during examinations: open-book or closed-book. The open-book assessment method is considered by many to be a realistic method that resembles the actual professional setting in which acquired knowledge is demonstrated in the field. On the other hand, the closed-book assessment method has been used for centuries in traditional institutions as a rigorous method of knowledge assessment. In this research, engineering and business students from a large university in the southwestern United States participated in an experimental comparison designed to determine whether the open-book or the closed-book approach better assesses academic knowledge during examinations. A Latin Square experimental design was used to block the variation due to the order in which students received the open-book and closed-book treatments, as well as differences in the material content tested on the exam. The study produced mixed results for the engineering students, who were tested in three different classes: Statics, Mechanics of Materials, and Quality Assurance. After adjusting for differences in material content and treatment order, the engineering students in two of the three classes attained higher scores, a possible indication of achieving a higher learning level, when they were tested with the closed-book approach. The business students, by contrast, attained higher scores, indicating a possible higher learning level, with the open-book approach. The implications of this research extend to today’s online testing and certification environments, which are typically “open-book”. The open-book nature of online testing is viewed by some as a necessary evil that poses a validity threat, and by others as a simulation of the professional environment. As a direction for future research, this study could be followed up with experiments that attempt to reproduce the results in an online environment.
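To make the blocking idea concrete, the sketch below sets up a hypothetical 2x2 crossover (Latin-square) layout and estimates the open-book vs. closed-book effect after adjusting for treatment order and exam content. This is a minimal illustration only: the variable names, sample size, simulated scores, and use of Python with statsmodels are assumptions for exposition, not the authors' data, design details, or analysis code.

```python
# Minimal sketch (hypothetical data) of a Latin-square-style blocked analysis
# for an open-book vs. closed-book comparison. Not the authors' actual study.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical layout: each student sits two exams, one open-book and one
# closed-book. Half the students get open-book first; exam 1 covers content A
# and exam 2 covers content B, so order and content act as blocking factors.
n_students = 40
rows = []
for student in range(n_students):
    order = "open_first" if student % 2 == 0 else "closed_first"
    for period, content in enumerate(["content_A", "content_B"]):
        treatment = ("open" if (period == 0) == (order == "open_first")
                     else "closed")
        score = 75 + rng.normal(0, 5)  # illustrative baseline plus noise
        rows.append({"student": student, "order": order,
                     "content": content, "treatment": treatment,
                     "score": score})
df = pd.DataFrame(rows)

# Treatment effect adjusted for the blocking factors (order and content),
# mirroring the "block the variation due to order and material content" idea.
model = smf.ols("score ~ C(treatment) + C(order) + C(content)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```

In this 2x2 layout the content factor coincides with the testing period, so adjusting for content also absorbs any period effect; a larger Latin square or a different crossing of factors would change the model terms accordingly.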

Introduction

The traditional invigilated closed-book approach to testing has been used for generations in institutions of higher learning, but with the advent of modern technology the open-book testing format is becoming more common. Controversy exists as to which of these two approaches better assesses academic learning and performance, and each method has its critics and its supporters. Although the closed-book invigilated style is the traditional format that has existed for generations, it is not necessarily problem-free. The main arguments against the closed-book format are that it is irrelevant to real-life professional practice, that it encourages recall-type learning rather than application-focused learning, that it encourages cheating, and that it is more costly to administer1. Before presenting our research study in detail, these four arguments are first examined here.

The first argument is that the traditional invigilated closed-book format is considered unrealistic compared with actual professional practice. In the engineering field, practicing engineers rely on manuals, technical books, the Internet, and any other available source to solve complex real-life engineering problems. Shine and his associates, in their article “In Defense of Open-Book Engineering Degree Examinations”, defended the open-book engineering testing

Anaya, L., & Evangelopoulos, N., & Lawani, U. (2010, June). Open Book Vs. Closed Book Testing: An Experimental Comparison. Paper presented at 2010 Annual Conference & Exposition, Louisville, Kentucky. 10.18260/1-2--16901

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2010 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015