June 18, 2006
Feeding Back Results from a Statics Concept Inventory to Improve Instruction
Effective assessment is known to be critical to improving learning outcomes [1,2]. For many engineering subjects, one hopes students will learn to transfer their newly gained knowledge to new situations, which requires a deep understanding of the material [3]; such understanding has been taken to mean conceptual understanding. One approach to assessing conceptual understanding, with origins in the science education community, is the Force Concept Inventory [4]. The concept inventory approach has since been extended by the engineering education community to a variety of engineering subjects [5]. If such inventories can also provide formative assessment, that is, feedback that improves the learning of the students who take them, their usefulness will be further enhanced.
In the present paper we consider the use of the Statics Concept Inventory (SCI), reported on previously [6,7], as a basis for formative assessment. To do so, we must demonstrate both that the results of this inventory are meaningful and worthwhile to feed back, and that effective formats can be devised for presenting these results. Accordingly, we first report briefly on indicators of the quality of the test results. Next, we show how results can be presented to help instructors compare their students' performance with that of students at other institutions, and to help students compare their performance with that of peers in their class. Finally, we present survey data from students who attended a review session addressing SCI questions prior to the class final exam.
Background on Statics Concept Inventory and Current Status
The SCI consists of 27 multiple-choice questions; each question addresses a single concept, involves negligible calculation, and features wrong answers that capture typical conceptual errors made by students. The conceptual framework and typical errors were based on field studies of students' work, as described elsewhere [8]. Psychometric analyses of the test as a whole and of its individual items have been conducted. Such analyses, which draw on statistical methods, aim to judge whether the test yields reliable and valid measurements of knowledge or skills. Based on these analyses, the test has improved steadily each year. The number of students taking the test has also increased: 245 in 2003-2004, 1330 in 2004-2005, and 1255 as of the first half of 2005-2006 (with 16 classes participating). Only very minor changes to the current version are anticipated.
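To make the notion of reliability concrete, the following sketch computes the Kuder-Richardson 20 (KR-20) coefficient, a standard internal-consistency statistic for dichotomously scored (right/wrong) items such as those on the SCI. This is an illustrative assumption about the kind of statistic a psychometric analysis might use; the paper itself does not specify its methods, and the response matrix below is invented data, not actual SCI results.

```python
# Illustrative KR-20 reliability calculation for a dichotomously
# scored test. All data here are hypothetical, not SCI results.

def kr20(scores):
    """scores: list of per-student lists of 0/1 item scores.
    Returns the KR-20 internal-consistency estimate."""
    n_students = len(scores)
    n_items = len(scores[0])
    # Sum over items of p*q, where p = proportion answering correctly.
    pq_sum = 0.0
    for i in range(n_items):
        p = sum(s[i] for s in scores) / n_students
        pq_sum += p * (1 - p)
    # Population variance of the students' total scores.
    totals = [sum(s) for s in scores]
    mean = sum(totals) / n_students
    var_total = sum((t - mean) ** 2 for t in totals) / n_students
    return (n_items / (n_items - 1)) * (1 - pq_sum / var_total)

# Hypothetical responses: 6 students x 5 items (1 = correct).
responses = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 0, 1],
    [1, 1, 1, 1, 0],
]
print(round(kr20(responses), 3))  # -> 0.551 for this toy data
```

Values near 1 indicate that the items consistently rank the same students high or low; item-level statistics such as discrimination indices would complement this whole-test figure.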
A critical feature of the test is that questions are grouped according to concept. In the 2005-2006 version of the test, there are 9 concepts, with 3 questions per concept. The concepts are given in Table 1.
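The grouping of questions by concept is what allows results to be fed back as per-concept subscores rather than a single total. A minimal sketch of that bookkeeping follows; it assumes, purely for illustration, that the 27 items are ordered concept by concept, which the paper does not state, and the concept labels themselves are in Table 1 rather than here.

```python
# Sketch of per-concept subscoring for a 27-question inventory with
# 9 concepts and 3 questions per concept. The concept-by-concept item
# ordering is an assumption made for illustration only.

N_CONCEPTS = 9
QUESTIONS_PER_CONCEPT = 3

def concept_subscores(item_scores):
    """item_scores: list of 27 values (1 = correct, 0 = wrong).
    Returns the number correct (0-3) for each of the 9 concepts."""
    assert len(item_scores) == N_CONCEPTS * QUESTIONS_PER_CONCEPT
    return [
        sum(item_scores[c * QUESTIONS_PER_CONCEPT:
                        (c + 1) * QUESTIONS_PER_CONCEPT])
        for c in range(N_CONCEPTS)
    ]

# Hypothetical student: correct on the first 20 questions only.
scores = [1] * 20 + [0] * 7
print(concept_subscores(scores))  # -> [3, 3, 3, 3, 3, 3, 2, 0, 0]
```

Subscores like these, aggregated over a class, are the kind of result an instructor could compare against other institutions, as described in the previous section.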
Steif, P., & Hansen, M. (2006, June), Feeding Back Results From A Statics Concept Inventory To Improve Instruction Paper presented at 2006 Annual Conference & Exposition, Chicago, Illinois. https://peer.asee.org/130