New Orleans, Louisiana
June 26, 2016
August 28, 2016
Perennially, educators, industrialists, social commentators, and politicians call for science, technology, engineering, and mathematics (STEM) instruction that matches an increasingly multifaceted global economy. In the U.S., this new economy presents a growing demand for STEM talent. However, current test-driven curricula and instructional practices in American schools cannot meet the challenge. The latest results from the Trends in International Mathematics and Science Study (TIMSS) and the Programme for International Student Assessment (PISA) show American students lagging behind their peers in other industrialized nations. Additional rationales for new approaches can be found in Rising above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future. For the past decade, one high school STEM education program has maintained a commitment to addressing these concerns. This effort has students conceiving, building, and launching rockets in an inquiry-, discovery-, and problem-based classroom. The rocket program aims to increase student interest in STEM by having students make rockets fly through their own efforts. Students get nine months of hands-on engagement that includes learning from direct and scholarly research, theory development, design brief creation, and post-mission analyses. The curriculum also emphasizes soft skills such as teamwork, communication, and leadership. Teachers work as roving facilitators whose goal is to help students “to see beyond the fire and smoke” and use data to direct effort. These teachers represent about 50 high schools in this Southern state. They are taught to use Socratic teaching methods, with a focus on formulating good questions that lead students to discovery across a range of topics, from aeronautics, electrical engineering, and fluid dynamics to algebra and calculus. How does one evaluate such a program?
This paper describes the evolution of an evaluation strategy for this divergent approach to STEM education. The evaluation strategy included four parts. The first was an exploratory evaluation. This effort was based on past data and interviews with stakeholders, and it resulted in a good baseline picture of where the program stood in 2014. Second, the evaluators created an implementation plan. Aligned with the exploratory evaluation, the implementation plan presented a program logic model, solidified program stakeholder and evaluation team roles, provided preliminary questionnaire maps, and defined evaluation products. It also laid out an agreed-upon timeline for deliverables. Third, the strategy included an annual evaluation of student and teacher opinions of their experiences. Finally, the strategy sketched the future architecture for an ongoing, real-time assessment system using a custom-designed social networking service. This paper shares the lessons learned that apply to evaluating STEM pedagogy and STEM programs that use nontraditional approaches and assessments.
Burley, H., Youngblood, T. D., Yeter, I. H., & Williams, C. M. (2016, June). Engineering an evaluation for a growing rocket program: Lessons learned. Paper presented at the 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. https://doi.org/10.18260/p.26616