
Validating of the diagnostic capabilities of concept inventories: Preliminary evidence from the Concept Assessment Tool for Statics (CATS)


Conference: 2012 ASEE Annual Conference & Exposition
Location: San Antonio, Texas
Publication Date: June 10, 2012
Start Date: June 10, 2012
End Date: June 13, 2012
ISSN: 2153-5965
Conference Session: Research in Engineering Education II
Tagged Division: Educational Research and Methods
Page Count: 19
Page Numbers: 25.1457.1 - 25.1457.19
DOI: 10.18260/1-2--22214
Permanent URL: https://peer.asee.org/22214


Paper Authors


Dana Denick, Purdue University, West Lafayette


Dana Denick is a Ph.D. student in the School of Engineering Education at Purdue University. She holds a BS in Mechanical Engineering from Bucknell University, an MA in Physics Education from the University of Virginia, and an MS in Library and Information Science from Drexel University. Her research interests include difficult concepts in engineering and information literacy for engineering.



Aidsa I. Santiago-Román, University of Puerto Rico, Mayaguez Campus


Aidsa I. Santiago-Román is an Assistant Professor in the Department of Engineering Science and Materials and the Director of the Strategic Engineering Education Development (SEED) Office at the University of Puerto Rico, Mayaguez Campus (UPRM). Dr. Santiago earned a BA (1996) and an MS (2000) in Industrial Engineering from UPRM, and a Ph.D. (2009) in Engineering Education from Purdue University. Her primary research interest is investigating students' understanding of difficult concepts in engineering science among underrepresented populations. She also teaches introductory engineering courses such as Problem Solving and Computer Programming, Statics, and Mechanics.



Ruth A. Streveler, Purdue University, West Lafayette


Ruth A. Streveler is an Assistant Professor in the School of Engineering Education at Purdue University. Before coming to Purdue in 2006, she spent 12 years at the Colorado School of Mines, where she was the founding Director of the Center for Engineering Education. Dr. Streveler earned a BA in Biology from Indiana University-Bloomington, an MS in Zoology from the Ohio State University, and a Ph.D. in Educational Psychology from the University of Hawaii at Manoa. Her primary research interests are investigating students' understanding of difficult concepts in engineering science and helping engineering faculty conduct rigorous research in engineering education.



Natalie Barrett, Purdue University, West Lafayette


Natalie Barrett is a Mechanical Engineering Ph.D. student at Purdue University with an interest in renewable energy. Natalie received a BSME from Florida State University, an MSME from the Georgia Institute of Technology, and an MBA from Indiana University. She has taught at the Wentworth Institute of Technology as an Adjunct Professor. She has also worked in industry at Pratt & Whitney for several years, serving in roles such as Integrated Product Team Leader and Affordability and Risk Manager for the F135 Engine Program.



Abstract

Assessment, specifically for the development of curricula and the evaluation of students' performance with respect to ABET accreditation requirements, has been an important aspect of engineering education. Engineering educators therefore need to implement rigorous assessment practices in their courses that are both valid and reliable, so that they have the evidence needed to improve students' learning. Engineering concept inventories (CIs) have been developed with the intention that faculty use them to assess students' understanding of specific concepts; unfortunately, they have been used primarily to assess the effectiveness of instructional techniques. Furthermore, traditional psychometric techniques applied to CIs have demonstrated the validity and reliability of CIs for measuring specific engineering concepts, but these techniques have not been able to provide instructors with information that would help them predict students' performance within those concepts. Diagnostic assessment techniques, such as the Fusion Model, provide instructors with information about students' prior knowledge and misconceptions before a learning activity begins. They also provide a baseline for understanding how much learning has taken place after the learning activity is completed.

Previous research studies have applied the Fusion Model to various large-scale instruments, including CIs. Specifically, a four-phase process was conducted in which the Fusion Model was applied to the Concept Assessment Tool for Statics (CATS). In that study, experts identified a set of fourteen "skills" whose mastery is needed to select a correct answer among distractors (common misconceptions). Results from this quantitative research methodology indicated a high diagnostic capacity for CATS. In addition, expected attribute mastery profiles were generated for each student, question, and "skill." These profiles could be used by instructors to design and implement more targeted instruction, in accordance with each student's cognitive capability. Qualitative research methodologies, such as think-alouds, were identified as necessary to validate experts' recommendations about students' point of view and their demonstration of the expected "skills."

This paper focuses on a pilot study conducted to establish a protocol for eliciting students' responses to five CATS questions whose corresponding concepts were identified as more problematic among engineering students. The research question that guided this study was: How do students' conceptions and misconceptions align with the expected skills and mastery profiles of the Concept Assessment Tool for Statics (CATS)? This research is guided by recommendations from Minstrell's work on facets of understanding, in which he argued that understanding how students make meaning of concepts will help teachers design more targeted instruction. Specifically, he recommended the use of open-ended interviews, focus groups, or think-aloud strategies to diagnose students' misconceptions. In the current study, students were interviewed as they answered CATS questions. Preliminary results from five students have shown that students' answering behavior may not be indicative of their conceptual thinking. It is crucial for the protocol to elicit responses in which students verbalize their thinking, so that the analysis does not rely on expert knowledge to infer students' understanding. The paper will describe how the protocol has been revised to address this limitation.
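For context, the Fusion Model named in the abstract (Hartz's reparameterized unified model from the diagnostic-measurement literature) is commonly written with the item response function below. This is a sketch of the standard form; the exact parameterization used in the study may differ.

```latex
P(X_{ij}=1 \mid \boldsymbol{\alpha}_j, \theta_j)
  = \pi_i^{*}
    \prod_{k=1}^{K} \left( r_{ik}^{*} \right)^{(1-\alpha_{jk})\, q_{ik}}
    \, P_{c_i}(\theta_j),
\qquad
P_{c_i}(\theta_j) = \frac{1}{1 + e^{-1.7\,(\theta_j + c_i)}}
```

Here \alpha_{jk} \in \{0,1\} records whether student j has mastered skill k (K = 14 in the CATS study described above), q_{ik} is the Q-matrix entry flagging whether item i requires skill k, \pi_i^{*} is the probability that a student who has mastered every required skill answers item i correctly, r_{ik}^{*} \in (0,1) is the penalty for lacking a required skill, and P_{c_i}(\theta_j) is a Rasch-type term absorbing residual ability not captured by the Q-matrix.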
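The sketch below shows, under the same assumptions, how that response function connects a Q-matrix row and a candidate skill profile to the probability of a correct answer, which is the quantity underlying the "expected mastery profiles" mentioned above. All parameter values, skill indices, and the helper name `fusion_prob` are illustrative, not taken from the paper.

```python
import numpy as np

def fusion_prob(pi_star, r_star, q_row, alpha, theta, c_i):
    """P(correct) for one item under a standard Fusion Model (RUM) form.

    pi_star -- P(correct | all required skills mastered)
    r_star  -- per-skill non-mastery penalties, each in (0, 1)
    q_row   -- 0/1 Q-matrix row: which skills the item requires
    alpha   -- 0/1 mastery profile of the student
    theta   -- residual ability not covered by the Q-matrix
    c_i     -- item "completeness" parameter in the Rasch-type term
    """
    # Penalty applies only where a required skill (q=1) is unmastered (alpha=0).
    skill_term = np.prod(r_star ** ((1 - alpha) * q_row))
    rasch_term = 1.0 / (1.0 + np.exp(-1.7 * (theta + c_i)))
    return pi_star * skill_term * rasch_term

# Hypothetical item requiring 2 of the 14 CATS skills (indices illustrative).
K = 14
q_row = np.zeros(K)
q_row[[2, 7]] = 1
r_star = np.full(K, 0.4)   # strong penalty for each missing required skill
alpha = np.ones(K)
alpha[7] = 0               # student has not mastered skill 7
p = fusion_prob(0.9, r_star, q_row, alpha, theta=0.0, c_i=1.0)
print(f"P(correct) = {p:.3f}")  # well below 0.9 because skill 7 is unmastered
```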

Denick, D., Santiago-Román, A. I., Streveler, R. A., & Barrett, N. (2012, June). Validating of the diagnostic capabilities of concept inventories: Preliminary evidence from the Concept Assessment Tool for Statics (CATS). Paper presented at the 2012 ASEE Annual Conference & Exposition, San Antonio, Texas. https://doi.org/10.18260/1-2--22214

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2012 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.