New Orleans, Louisiana
June 26, 2016
August 28, 2016
Pre-College Engineering Education Division
With the new emphasis on engineering practices and engineering design (NGSS Lead States, 2013), teachers of and researchers studying K-12 engineering need ways to measure students' developing engineering skills. To measure student learning of engineering practices efficiently, there is a need for a tool that captures student performances in a way that readily affords evaluation and scalability. The problem we pursue in this paper is how to accomplish this measurement. Can we evaluate individual elementary students' skills with a quick pencil-and-paper design task? What aspects of performance does the measure capture? How do these aspects match the intended measurement goals? Can the instrument be coded efficiently and reliably using a rubric, so that researchers and teachers can make use of it?

To address this need, we developed a performance assessment. Performance assessments are a form of contextual assessment in which students engage in tasks within a context that affords the use of the practices of interest to the assessor (Klassen, 2006). Students are presented with three quick design challenges to choose from; each is presented as a written thought experiment with follow-up questions. During development of the instrument, as part of our process of gathering evidence for validity, we conducted think-aloud protocols with a dozen students in the target age range (8-11) who were learning engineering, to inform the design and to ensure that students interpreted the instrument as intended. To characterize the quality of performance on the written assessment, we developed a rubric focused solely on the aspects of the NGSS Engineering Disciplinary Core Ideas and Practices that we expected to see, in particular (a) generating multiple possible solutions, (b) evaluating a potential solution against criteria and constraints, (c) planning an investigation, and (d) communicating information (a design plan). Using the rubric, we individually scored 1,531 written assessments from 276 grades 3-5 classrooms. Participating classrooms came from schools in three states, spanning urban, suburban, and rural contexts and a variety of racial and ethnic demographics.

In this paper, we present the instrument and coding rubric. We calculate inter-rater reliability for coders and present descriptive statistics for student scores to demonstrate the utility of the instrument for distinguishing a range of performances. To build a validity case for using the assessment to measure student learning of practices, we compare video of 30 students working on design challenges in their student groups, collected from 10 of the participating classrooms, with the same students' performance on the assessment. This comparison also informs the uses and the limits of the written performance assessment for measuring elementary students' engineering skills and understanding-in-use. Finally, we describe the time needed to score the assessments and discuss the instrument's utility for larger-scale research studies.
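As an illustration of the reliability and descriptive analyses named above, the following is a minimal Python sketch, not the authors' actual analysis code: it assumes double-coded rubric scores on a hypothetical 0-3 scale and computes Cohen's kappa (one common inter-rater reliability statistic) alongside simple descriptive statistics.

# Minimal sketch of inter-rater reliability and descriptive statistics
# for rubric scores. The dimension name, 0-3 scale, and data below are
# illustrative assumptions, not the paper's actual coding scheme.
from collections import Counter
from statistics import mean, stdev

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters on the same items, corrected for chance."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical scores (0-3) from two coders on one rubric dimension,
# e.g. "generating multiple possible solutions".
coder_1 = [2, 3, 1, 0, 2, 2, 3, 1, 2, 0]
coder_2 = [2, 3, 1, 1, 2, 2, 3, 1, 3, 0]

print(f"Cohen's kappa: {cohens_kappa(coder_1, coder_2):.2f}")
print(f"Mean score: {mean(coder_1):.2f}, SD: {stdev(coder_1):.2f}")

Kappa is computed here by hand from the standard library to keep the sketch self-contained; equivalent results come from sklearn.metrics.cohen_kappa_score.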
Klassen, S. (2006). Contextual assessment in science education: Background, issues, and policy. Science Education, 90(5), 820-851.
NGSS Lead States. (2013). Next Generation Science Standards: For States, By States. Washington, DC: The National Academies Press.
Lachapelle, C. P., & Cunningham, C. M. (2016, June). Performance Assessment in Elementary Engineering: Evaluating Student (RTP). Paper presented at the 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. doi: 10.18260/p.25884
© 2016 American Society for Engineering Education.