June 23, 2013
June 26, 2013
Educational Research and Methods
23.1352.1 - 23.1352.13
Validating the Diagnostic Capability of the Concept Assessment Tool for Statics (CATS) with Student Think-Alouds

Engineering concept inventories have the potential to be used as diagnostic instruments that can provide instructors with information about student understanding of key concepts, which in turn can be used to guide classroom instruction and improve student learning. Doing so requires evidence of the validity of assumptions regarding the concepts and errors assessed by individual items, together with techniques for extracting diagnostic information.

In developing CATS, Steif1 drew upon his own and others' previous research regarding key concepts and common misconceptions that students demonstrate in reasoning about statics problems. Both essential conceptual knowledge and common errors have been mapped to individual CATS items based on experts' predictions of how students may reason about a given problem situation. Prior multivariate psychometric analyses of inventory and item performance have supported these mappings, including work that proposed a matrix of cognitive attributes applicable to each CATS item2.

This paper describes the results of a verbal protocol study eliciting students' reasoning about key concepts ostensibly required to solve 14 CATS items, with the goal of amassing evidence to extend the instructional value of CATS. The research questions guiding this study were:

• How is students' thinking about key concepts and skills in statics represented in verbal descriptions of their reasoning while solving CATS items?
• How does students' thinking align with the presumed set of skills and errors underlying the design and proposed interpretation of CATS items?

This study was guided by the related work of Minstrell3-4 and his colleagues on facets of understanding, in which the diagnosis of students' conceptual understanding may help instructors design more targeted and meaningful instruction.
Qualitative research methods, specifically verbal think-aloud protocols, were identified to further validate proposed models of cognitive skills and student errors. In the present study, students were prompted to explain their line of reasoning for selected CATS items and to describe why alternate responses were not selected. The previously identified skills and errors were then used as part of an analytic coding scheme that allowed for the emergence of additional concepts and errors beyond those originally posited for each item.

Based on student responses, it appears that the expert-generated model of knowledge and skills may be sufficient overall, although individual skills may align with specific CATS items differently than expected. Also, some evidence indicates that the common errors associated with CATS should include two additional errors related to misconceptions of moment.

The findings of this study promise several broader impacts. First, they provide evidence of student thinking as a means of validating the diagnostic capability of CATS. Second, the information provided by these studies will inform and enhance the interpretation of student performance on CATS. Third, some of the findings may indicate aspects of the current CATS that could be considered for modification, including instances of CATS questions, multiple-choice options, concept descriptions and mappings, and common student error descriptions and mappings. Finally, the identification of trends in how students conceptualize statics problems, as provided in this study, may prove generally useful to inform statics instruction.

1. Steif, P.S. and J.A. Dantzler, A statics concept inventory: Development and psychometric analysis. Journal of Engineering Education, 2005. 94(4): p. 363-371.
2. Santiago-Roman, A.I., et al., The development of a Q-matrix for the Concept Assessment Tool for Statics. In 2010 ASEE Annual Conference and Exposition. Louisville, KY: American Society for Engineering Education, 2010.
3. Minstrell, J., Student thinking and related assessment: Creating a facet-based learning environment. In Grading the Nation's Report Card: Research from the Evaluation of NAEP, 2000. p. 44-73.
4. Minstrell, J. and E. van Zee, Using questioning to assess and foster student thinking. In Everyday assessment in the science classroom, J.M. Atkin and J.E. Coffey, Editors. National Science Teachers Association Press: Arlington, VA, 2003. p. 61-73.
Denick, D., Santiago-Román, A. I., Pellegrino, J. W., Streveler, R. A., & DiBello, L. V. (2013, June). Validating the Diagnostic Capability of the Concept Assessment Tool for Statics (CATS) with Student Think-Alouds. Paper presented at the 2013 ASEE Annual Conference & Exposition, Atlanta, Georgia. https://peer.asee.org/22737
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2013 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015