Continuous Improvement of a Concept Inventory: Using Evidence-Centered Design to Refine the Thermal and Transport Concept Inventory

Conference

2015 ASEE Annual Conference & Exposition

Location

Seattle, Washington

Publication Date

June 14, 2015

Start Date

June 14, 2015

End Date

June 17, 2015

ISBN

978-0-692-50180-1

ISSN

2153-5965

Conference Session

Concept Inventories and Assessment of Knowledge

Tagged Division

Educational Research and Methods

Page Count

13

Page Numbers

26.404.1 - 26.404.13

DOI

10.18260/p.23743

Permanent URL

https://peer.asee.org/23743

Download Count

675

Paper Authors

Brian Douglas Gane, University of Illinois at Chicago

Dr. Brian Gane is a Visiting Research Assistant Professor at the Learning Sciences Research Institute, University of Illinois at Chicago. Dr. Gane's research focuses on psychological theories of learning and skill acquisition, assessment, instructional design, and STEM education.

Dana Denick, National Science Foundation

Natalie Jorion

Louis V. DiBello, University of Illinois at Chicago

James W. Pellegrino, University of Illinois at Chicago

Ruth A. Streveler, Purdue University, West Lafayette

Abstract

Concept inventories (CIs) are increasingly being developed and used in engineering courses to assess student learning and understanding and to evaluate instructional practices. CIs often have substantial research underlying their development. Nevertheless, validating an assessment involves explicating the proposed uses and interpretations of test scores and marshaling evidence to support the acceptability and plausibility of these claims. As part of explicating and testing validity arguments for the Thermal and Transport Concept Inventory (TTCI), we realized that the instrument might be improved to better support the claims we wanted to make about test score meaning and use.

In this paper we explain our process of iteratively redesigning a CI, in which we simultaneously modified the domain model, the items, and the instrument. We use examples from the TTCI to illustrate these changes, including changes to the domain model and examples of how item testing led to successive refinement of the items. In doing so, we argue that (1) adopting an explicit assessment design framework facilitates later validity arguments, (2) multiple rounds of testing and design are required, and (3) a "Q-matrix" is useful for tracking the mappings between item response options and concepts/misconceptions.

To guide our redesign, and to ensure that it enabled us to make our intended assessment arguments, we used an explicit design process. We adopted Evidence-Centered Design, which involved constructing a domain model and task design patterns. Item development was based on these design patterns, which enable explicit mappings between item features and the domain model.

We argue that effective redesign is iterative and involves multiple activities, many of which operate in parallel and feed back into one another. These activities span multiple levels (domain, instrument, item) and, when done in tandem, successively improve the cohesiveness and completeness of our interpretation/use argument. For instance, to investigate student reasoning and how it aligns with hypothesized student thinking, we conducted three think-aloud studies (each focused on 5–6 TTCI items). To investigate how the items function and whether the instrument as a whole can support hypothesized student domain reasoning, we conducted two quantitative analyses. Throughout this redesign effort we worked from a domain model that evolved through three distinct versions: the second version expanded to include more domain concepts and misconceptions, and the third version contracted these into a few central ideas (echoing calls for curriculum to be deep rather than wide).

These redesign efforts have yielded a final version of the TTCI built on a solid design framework and domain model. In future work we will conduct quantitative analyses of large-scale performance data on the full instrument. These analyses will allow us to evaluate a multi-faceted validity argument, which will help researchers and instructors understand how to use and interpret TTCI scores as part of improving engineering instruction.

References

Kane, M. T. (2013). Validating the interpretations and uses of test scores. Journal of Educational Measurement, 50, 1–73.

Mislevy, R. J., Steinberg, L. S., & Almond, R. G. (2003). On the structure of educational assessments. Measurement: Interdisciplinary Research and Perspectives, 1, 3–67.

National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Committee on the Foundations of Assessment; J. Pellegrino, N. Chudowsky, & R. Glaser (Eds.). Board on Testing and Assessment, Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.

National Research Council. (2012). Discipline-based education research: Understanding and improving learning in undergraduate science and engineering. Committee on the Status, Contributions, and Future Directions of Discipline-Based Education Research; S. R. Singer, N. R. Nielsen, & H. A. Schweingruber (Eds.). Board on Science Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
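The abstract argues that a Q-matrix is useful for tracking mappings between item response options and concepts/misconceptions. The following is a minimal sketch of one way such a matrix might be represented and checked for coverage; the item labels and concept/misconception names are hypothetical illustrations, not taken from the TTCI or the paper.

import pandas as pd

# Hypothetical Q-matrix: rows are (item, response option) pairs, columns are
# concepts or misconceptions; a 1 marks that selecting the option provides
# evidence about the listed concept or misconception.
options = [
    ("item_01", "A"), ("item_01", "B"), ("item_01", "C"),
    ("item_02", "A"), ("item_02", "B"), ("item_02", "C"),
]
concepts = ["rate_vs_amount", "heat_vs_temperature", "steady_state"]

q = pd.DataFrame(
    0,
    index=pd.MultiIndex.from_tuples(options, names=["item", "option"]),
    columns=concepts,
)

# Tag which concept or misconception each response option targets (illustrative).
q.loc[("item_01", "A"), "rate_vs_amount"] = 1       # keyed answer -> target concept
q.loc[("item_01", "B"), "heat_vs_temperature"] = 1  # distractor -> misconception
q.loc[("item_02", "C"), "steady_state"] = 1

# Coverage check: how many response options probe each concept/misconception?
print(q.sum(axis=0))

# Flag response options not yet mapped to any concept or misconception,
# i.e., gaps in the item-to-domain-model mapping.
unmapped = q.index[q.sum(axis=1) == 0]
print("Unmapped options:", list(unmapped))

A tabulation like this makes it straightforward to see, during successive rounds of redesign, which concepts or misconceptions are over- or under-represented and which distractors are not yet linked to the domain model.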

Gane, B. D., & Denick, D., & Jorion, N., & DiBello, L. V., & Pellegrino, J. W., & Streveler, R. A. (2015, June), Continuous Improvement of a Concept Inventory: Using Evidence-Centered Design to Refine the Thermal and Transport Concept Inventory Paper presented at 2015 ASEE Annual Conference & Exposition, Seattle, Washington. 10.18260/p.23743

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015