
Getting More From Your Data: Application Of Item Response Theory To The Statistics Concept Inventory


Conference

2007 Annual Conference & Exposition

Location

Honolulu, Hawaii

Publication Date

June 24, 2007

Start Date

June 24, 2007

End Date

June 27, 2007

ISSN

2153-5965

Conference Session

Assessment and Evaluation in Engineering Education II

Tagged Division

Educational Research and Methods

Page Count

20

Page Numbers

12.782.1 - 12.782.20

Permanent URL

https://peer.asee.org/2465



Paper Authors


Kirk Allen, Purdue University


Kirk Allen is a post-doctoral researcher in Purdue University's Department of Engineering Education. His dissertation research at The University of Oklahoma was the development and analysis of the Statistics Concept Inventory (NSF DUE 0206977), which combines his interest in statistics and assessment methodologies.



Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Getting More from your Data: Application of Item Response Theory to the Statistics Concept Inventory

Abstract

This paper applies the techniques of Item Response Theory (IRT) to data from the Statistics Concept Inventory (SCI). Based on results from the Fall 2005 post-test (n = 422 students), the analyses of IRT are compared with those of Classical Test Theory (CTT). The concepts are extended to discussions of other applications, such as computerized adaptive testing and Likert-scale items, which may be of interest to the engineering education community.

While techniques based on CTT generally yield valuable information, methods of IRT can reveal unanticipated subtleties in a dataset. For example, items of extreme difficulty (hard or easy) typically attain low discrimination indices (CTT), thus labeling them as “poor”. Application of IRT can identify these items as strongly discriminating among students of extreme ability (high or low). The three simplest IRT models (one-, two-, and three-parameter) are compared to illustrate cases where they differ. The theoretical foundations of IRT are provided, extending to validating the assumptions for the SCI dataset and discussing other potential uses of IRT that are applicable to survey design in engineering education.
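The relationship among the three models mentioned above can be sketched numerically. The snippet below is an illustrative sketch, not code from the paper: it evaluates the three-parameter logistic (3PL) item characteristic curve, of which the 2PL (c = 0) and 1PL/Rasch (c = 0, a fixed at 1) models are special cases. All parameter values are hypothetical.

```python
import math

def icc(theta, a=1.0, b=0.0, c=0.0):
    """Three-parameter logistic (3PL) item characteristic curve.

    theta: examinee ability
    a: discrimination, b: difficulty, c: pseudo-guessing (lower asymptote).
    c = 0 gives the 2PL model; c = 0 and a = 1 gives the 1PL (Rasch) model.
    """
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# A hypothetical hard item (b = 2): averaged over all examinees it can look
# weakly discriminating under CTT, yet its curve is steepest near theta = 2,
# i.e., it discriminates strongly among high-ability examinees.
for theta in (-2, 0, 2, 4):
    print(f"theta = {theta:+d}  P(correct) = {icc(theta, a=1.5, b=2.0, c=0.2):.3f}")
```

At theta = b the logistic term equals 0.5, so the 3PL probability is c + (1 - c)/2 (here 0.6); the curve's slope at that point grows with a, which is how IRT separates an item's difficulty from its discrimination.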

Introduction

The Steering Committee of the National Engineering Education Research Colloquies [1] identified assessment (“Research on, and the development of, assessment methods, instruments, and metrics to inform engineering education practice and learning”) as one of five areas that form the foundation of engineering education research. Further, there are calls for engineering education to become both more inter-disciplinary [2] and rigorous [3].

Item Response Theory (IRT) [4-7] is commonly used by psychologists in survey design and analysis; effectively “learning the language” opens a conduit for collaboration and dissemination. IRT is therefore a useful addition to the engineering education researcher’s toolbelt. The mathematical advantages of IRT enhance rigor, with procedures to track characteristics of both the test and the examinees across variables such as time, gender, and major.

This article first describes item response theory from a theoretical perspective, covering the common models for dichotomous items (those scored ‘correct’ or ‘incorrect’) and their assumptions. Second, interpretation of IRT is illustrated by application to the Statistics Concept Inventory (SCI). Finally, some extensions of IRT are described which may be of interest to the engineering educator.

Allen, K. (2007, June), Getting More From Your Data: Application Of Item Response Theory To The Statistics Concept Inventory Paper presented at 2007 Annual Conference & Exposition, Honolulu, Hawaii. https://peer.asee.org/2465

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2007 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015