Developing and Validating a Concept Inventory

Conference: 2015 ASEE Annual Conference & Exposition
Location: Seattle, Washington
Publication Date: June 14, 2015
Start Date: June 14, 2015
End Date: June 17, 2015
ISBN: 978-0-692-50180-1
ISSN: 2153-5965
Conference Session: Concept Inventories and Assessment of Knowledge
Tagged Division: Educational Research and Methods
Page Count: 12
Page Numbers: 26.497.1 - 26.497.12
DOI: 10.18260/p.23836
Permanent URL: https://peer.asee.org/23836
Download Count: 701

Paper Authors

Natalie Jorion, University of Illinois at Chicago

Natalie Jorion is a research assistant and Ph.D. student in the learning sciences, specializing in psychometrics, at the University of Illinois at Chicago, 1240 W. Harrison St., Chicago, IL 60607; njorio2@uic.edu.

Brian Douglas Gane, University of Illinois at Chicago

Dr. Brian Gane is a Visiting Research Assistant Professor at the Learning Sciences Research Institute, University of Illinois at Chicago. Dr. Gane's research focuses on psychological theories of learning and skill acquisition, assessment, instructional design, and STEM education.

Louis V. DiBello

James W. Pellegrino, University of Illinois at Chicago

Abstract

Concept inventories (CIs) have been used to assess undergraduate students' understanding of important and difficult concepts in engineering disciplines. However, research has shown that even meticulously designed CIs often fall short of measuring students' conceptual understanding. The particular understandings CI developers intend to measure are sometimes not congruent with the skills and conceptual understandings students actually use to interpret and respond to items. This incongruity can occur despite developers' expertise, perhaps because of the "expert blind spot". Even when developers are able to create items that tap into the intended conceptual understandings, the assessments may not indicate the extent to which students have mastered particular concepts. Creating an inventory that is interpretable and meaningful requires that developers take great care in constructing their assessment and consider their validity arguments from the outset. They need to consider the extent to which an assessment measures what it was intended to measure, as demonstrated through properties of the assessment that include examinee response patterns; such considerations are part of an evidentiary argument process.

This paper outlines several actions developers can take to create high-quality inventories, using an evidentiary approach to assessment design. It also presents an analytic framework that can be used to evaluate validity arguments once the assessment has been developed. We use two case studies to illustrate the application of these analyses to evaluating a CI's validity: the Conceptual Assessment Tool for Statics (CATS) and the Dynamics Concept Inventory (DCI). For example, the developers of these CIs make claims about overall conceptual understanding of the domain, understanding of specific concepts, and the presence of particular misconceptions or common errors. Our analyses found varying degrees of support for each claim, such as the use of the assessment as a measure of students' overall understanding. Only CATS provided evidence regarding student understanding of specific domain concepts, and neither assessment showed evidence for differentiating among student misconceptions. Overall, this framework can help provide structure for evaluating the validity arguments of CIs, and can help CI developers look ahead when creating inventories so that validity claims are better aligned with student reasoning.

References

Kane, M. T. (2013). Validating the interpretations and uses of test scores. Journal of Educational Measurement, 50, 1-73.

Mislevy, R. J., Steinberg, L. S., & Almond, R. G. (2003). Focus article: On the structure of educational assessments. Measurement: Interdisciplinary Research and Perspectives, 1(1), 3-62.

Nathan, M. J., & Koedinger, K. R. (2000). An investigation of teachers' beliefs of students' algebra development. Cognition and Instruction, 18(2), 209-237.

Nathan, M. J., & Petrosino, A. (2003). Expert blind spot among preservice teachers. American Educational Research Journal, 40(4), 905-928.
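As a concrete, hypothetical illustration of what examinee response-pattern evidence can look like, the short Python sketch below computes a few classical test theory statistics (item difficulty, item-rest discrimination, and Cronbach's alpha) on a simulated 0/1 response matrix. It is not drawn from the paper's analyses of CATS or the DCI; the simulated data, sample sizes, and choice of statistics are assumptions made purely for illustration.

```python
# Hypothetical illustration (not taken from the paper): classical test theory
# statistics of the kind that response-pattern analyses of a concept inventory
# often begin with, computed on a simulated 0/1 response matrix.
import numpy as np

rng = np.random.default_rng(0)
n_students, n_items = 200, 30
responses = rng.integers(0, 2, size=(n_students, n_items))  # 1 = correct, 0 = incorrect

# Item difficulty: proportion of examinees answering each item correctly.
difficulty = responses.mean(axis=0)

# Item discrimination: point-biserial correlation of each item with the
# total score on the remaining items (the "rest score").
total = responses.sum(axis=1)
discrimination = np.array([
    np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
    for j in range(n_items)
])

# Cronbach's alpha: internal consistency of the total score, relevant to
# claims about measuring overall conceptual understanding of a domain.
item_variances = responses.var(axis=0, ddof=1).sum()
total_variance = total.var(ddof=1)
alpha = (n_items / (n_items - 1)) * (1 - item_variances / total_variance)

print(f"Cronbach's alpha: {alpha:.2f}")
print("Five hardest items:", np.argsort(difficulty)[:5])
print("Five least discriminating items:", np.argsort(discrimination)[:5])
```

Statistics like these speak only to some of the claims discussed above (e.g., overall understanding); claims about specific concepts or particular misconceptions generally require additional evidence, such as analyses tied to the inventory's concept and distractor structure.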

Jorion, N., Gane, B. D., DiBello, L. V., & Pellegrino, J. W. (2015, June). Developing and Validating a Concept Inventory. Paper presented at the 2015 ASEE Annual Conference & Exposition, Seattle, Washington. 10.18260/p.23836

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015