
The Statistics Concepts Inventory: Developing A Valid And Reliable Instrument


2004 Annual Conference


Salt Lake City, Utah

Publication Date

June 20, 2004

Start Date

June 20, 2004

End Date

June 23, 2004



Conference Session

Assessment Issues I

Page Numbers

9.1292.1 - 9.1292.15



Paper Authors

Kirk Allen

Teri Reed Rhoads

Teri Murphy

Andrea Stone

NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Session 3230

The Statistics Concepts Inventory: Developing a Valid and Reliable Instrument

Kirk Allen1, Andrea Stone2, Teri Reed Rhoads1, Teri J. Murphy2
University of Oklahoma
1School of Industrial Engineering, 2Department of Mathematics


The Statistics Concepts Inventory (SCI) is currently under development at the University of Oklahoma. This paper documents the early stages of assessing the validity, reliability, and discriminatory power of a cognitive assessment instrument for statistics. The evolution of test items on the basis of validity, reliability, and discrimination is included. The instrument has been validated on the basis of content validity through the use of focus groups and faculty surveys. Concurrent validity is measured by correlating SCI scores with course grades. The SCI currently attains concurrent validity for Engineering Statistics courses, but fails to do so for Mathematics Statistics courses. Because the SCI is targeted at Engineering departments, this is a good starting point, but the researchers hope to improve the instrument so that it has applicability across disciplines. The test is shown to be reliable in terms of coefficient alpha for most populations. This paper also describes how specific questions have changed as a result of answer distribution analysis, reliability, discrimination, and focus group comments. Four questions are analyzed in detail: 1) one that was thrown out, 2) one that underwent major revisions, 3) one that required only minor changes, and 4) one that required no changes.


The concept inventory movement was spurred by the development and successful implementation of the Force Concept Inventory1,2. The FCI was developed as a pre-post test to identify student misconceptions when entering a course and check for gains upon completing the course. After many rounds of testing, it was discovered that students gain the most conceptual knowledge in interactive engagement courses, as opposed to traditional lectures3.

The success of the FCI prompted researchers to develop instruments in other fields. In light of recent ABET accreditation standards, which focus on outcomes rather than simply fulfilling seat-time requirements, many engineering fields have begun to develop concept inventories4.

The pilot study of the Statistics Concept Inventory (SCI), conducted in Fall 2002, contained 32 questions and examined differences in scores due to gender and academic discipline5. The study

Proceedings of the 2004 American Society for Engineering Education Annual Conference & Exposition Copyright © 2004, American Society for Engineering Education

Allen, K., & Rhoads, T. R., & Murphy, T., & Stone, A. (2004, June), The Statistics Concepts Inventory: Developing A Valid And Reliable Instrument. Paper presented at 2004 Annual Conference, Salt Lake City, Utah. 10.18260/1-2--13652
