June 15, 2019
June 19, 2019
Pre-College Engineering Education
[Study] is a randomized, controlled study of an educational intervention ([Treatment]) for students in grades 3-5, in which schools were randomized to teach either the [Treatment] or the [Comparison] curriculum. [Treatment] is a project-based learning curriculum designed according to social constructivist learning principles; [Comparison] is designed as direct (didactic) instruction with hands-on activities.
Key to measuring the true effect of an intervention like [Treatment] is measurement of its fidelity of implementation (FOI) by teachers, in order to improve and assure internal validity. The social work and health fields have undertaken considerable work to define implementation fidelity, and science education researchers have built upon this work to study the implementation of science curricula. In seeking to understand FOI, we rely on a framework defined by Carroll et al., with some modification based on the work of O'Donnell. In this framework, the causal effect of an intervention on outcomes depends primarily upon FOI. FOI includes specifics of what content was addressed, the frequency and duration of lessons, and student participation: the "structure" of implementation. The relationship between the intervention as intended and adherence to it may be modified by other factors, including the complexity of the intervention, facilitation strategies (teacher guide, PD workshops, incentives, etc.), and participant response (teacher and student attitudes and judgments of value). Carroll et al. categorize quality of delivery by the teacher (also called an "intervention process," or "the way in which services are delivered") as a potential moderator; however, we follow the majority of studies in considering it an element of FOI.
Mowbray et al. advocated that researchers identify and develop valid and reliable measures for the "fidelity criteria" of an intervention. The identification of critical components is the first step in developing fidelity criteria. Critical components must include both structural components (specifying elements of adherence) and process components (specifying quality of delivery and program differentiation). Program differentiation, the implementation of critical components unique to the intervention, is particularly important because it affects whether evaluation of the outcomes will find an effect of the intervention beyond that of the comparison.
In this paper, we describe our instruments for measuring fidelity and give evidence for the reliability and validity of their use in [Study]. The most important of these are the Engineering Logs, in which teachers were prompted to indicate which portions of each [Treatment] or [Comparison] lesson they completed, the duration and date of each lesson, and how they taught each portion of the lesson. The Logs allow us to measure whether teachers were using a pedagogy more in line with [Treatment] or [Comparison], a measure of program differentiation. Our measure of the quality of teaching implementation, StudentJournalQuality, was coded from a random subset of six student engineering journals from each class. We provide both qualitative and quantitative analyses demonstrating the suitability of these measures.
Lachapelle, C. P., & Cunningham, C. M. (2019, June). Measuring Fidelity of Implementation in a Large-Scale Research Study (RTP). Paper presented at the 2019 ASEE Annual Conference & Exposition, Tampa, Florida. 10.18260/1-2--33089