Testing The Test: Validity And Reliability Of Senior Exit Exam

Conference: 2010 Annual Conference & Exposition
Location: Louisville, Kentucky
Publication Date: June 20, 2010
Start Date: June 20, 2010
End Date: June 23, 2010
ISSN: 2153-5965
Conference Session: IE and the Classroom
Tagged Division: Industrial Engineering
Page Count: 14
Page Numbers: 15.1202.1 - 15.1202.14
DOI: 10.18260/1-2--16563
Permanent URL: https://peer.asee.org/16563

Paper Authors

Lizabeth Schlemer, California Polytechnic State University
Daniel Waldorf, California Polytechnic State University

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Testing the Test: Validity and Reliability of Senior Exit Exam

Lizabeth Schlemer and Daniel Waldorf
California Polytechnic State University, San Luis Obispo

Abstract

A senior exit exam is considered an excellent direct measure of student learning for ABET assessment, but the usefulness of the information gathered depends on the validity and reliability of the test itself. The Industrial and Manufacturing Engineering Department at California Polytechnic State University, San Luis Obispo has used a content exam for several years. This paper discusses the exam's development and administration and the role it plays in the assessment process. In addition, the test is evaluated using the standard psychometric techniques of reliability and validity analysis, and the results of the evaluation are used to refine the test. The importance of evaluating these types of instruments cannot be overstated, as they are often used to guide curricular and other program improvement efforts.

INTRODUCTION

The Accreditation Board for Engineering and Technology (ABET)1 encourages programs to use direct measures of performance when evaluating the achievement of learning outcomes prior to student graduation. Direct measures are those that assess achievement by observation of performance rather than by soliciting opinions about the achievement of a particular outcome. A standardized exam is a good direct measure; others might include a third-party evaluation of student projects or a manager's assessment of work done on a co-op or internship. A standardized exam may be the most tempting option for busy faculty trying to assess their program because it is fairly easy to administer, the results are naturally quantifiable, and the program can more or less guarantee a consistent response rate. Such an exam, however, should undergo a psychometric evaluation of its reliability, validity, and item correlations before the results are used to justify investing significant time and effort in program improvements.

Psychometric Evaluation

The aim of a psychometric evaluation of a test is to determine how well the instrument, in this case the test, measures the construct of interest, in this case the individual's ability. In every measurement, whether it is a physical measurement taken with a micrometer or a psychological measurement taken with a survey, the resultant value contains some amount of error. To evaluate the quality of a measurement device, psychometricians use two general characteristics: reliability and validity. Reliability describes the consistency of a measurement, while validity addresses the appropriateness of the instrument for measuring the desired construct2. A test can be completely consistent yet measure the wrong thing. If students take a calculus test and score at a consistently high or low level every time they take it, the test would be deemed reliable; yet it probably would not be valid for assessing a student's teamwork skills. This would be an example of a reliable but invalid (for teamwork) test.
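The excerpt does not specify which reliability statistic the authors apply to the exit exam, but the internal consistency of a multi-item exam is commonly summarized with Cronbach's alpha. The sketch below is illustrative only; the function name and the item-response data are hypothetical, not taken from the paper.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Internal-consistency reliability for a (students x items) matrix of
    item scores (e.g., 1 = correct, 0 = incorrect on each exam question)."""
    k = scores.shape[1]                          # number of test items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item across students
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of students' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 5 students answering a 4-item exam
responses = np.array([
    [1, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```

An alpha near 1 would indicate that the exam items behave consistently across students, while a low alpha would flag items worth revising before the results are used to guide program changes.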

Schlemer, L., & Waldorf, D. (2010, June), Testing The Test: Validity And Reliability Of Senior Exit Exam Paper presented at 2010 Annual Conference & Exposition, Louisville, Kentucky. 10.18260/1-2--16563

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2010 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.