Developing common qualitative tools for cross ERC education program evaluation

Conference: 2022 ASEE Annual Conference & Exposition

Location: Minneapolis, MN

Publication Date: August 23, 2022

Start Date: June 26, 2022

End Date: June 29, 2022

Conference Session: Educational Research and Methods (ERM) Division Poster Session

Page Count: 12

DOI: 10.18260/1-2--41180

Permanent URL: https://peer.asee.org/41180

Paper Authors

Zhen Zhao, Arizona State University, Polytechnic Campus

Zhen Zhao is a Ph.D. student in The Polytechnic School at Arizona State University. His research interests include engineering student mentorship and leadership development, engineering research center education and diversity impact evaluation, and engine

Megan O'Donnell, Arizona State University

Michelle Jordan, Arizona State University

Wilhelmina Savenye, Arizona State University

Gillian Roehrig, University of Minnesota - Twin Cities

Marcus Lyra, Arizona State University

Engineers are motivated by innovation and new ideas, and many scholars have spent their careers finding and suggesting effective ways to support lifelong learning in engineering, from K-12 through professional practice. I am one of them: I am passionate about how engineering educators become educators, how they seek teaching-related professional development, and how that development translates into effective engineering courses, which I study using quantitative, qualitative, and mixed-methods analysis.

Abstract

National Science Foundation (NSF)-funded Engineering Research Centers (ERCs) are required to develop and implement education and outreach opportunities related to their core technical research topics to broaden participation in engineering and create partnerships between industry and academia. Additionally, ERCs must include an independent evaluation of their education and outreach programming to assess their performance and impacts. To date, each ERC's evaluation team has designed its own instruments/tools and protocols for evaluation, resulting in idiosyncratic and redundant efforts. Nonetheless, there is much overlap among the evaluation topics, concepts, and practices, suggesting that the ERC evaluation and assessment community might benefit from having a common set of instruments and protocols. ERCs' efforts could then be better spent developing more specific, sophisticated, and time-intensive evaluation tools to deepen and enrich the overall ERC evaluation efforts. Implementing such a suite of instruments would further allow each ERC to compare its efforts with those of other ERCs as one data point for assessing its effectiveness and informing its improvement efforts. Members of a multi-ERC collaborative team, funded by the NSF, have been leading a project to develop a suite of common instruments and protocols that contains both quantitative and qualitative tools. This paper reports on the development of a set of qualitative instruments that, to date, includes the following: (a) a set of interview/focus group protocols intended for various groups of ERC personnel, centered on five common topics/areas, and (b) rubrics for summer program participants' oral poster/presentation deliveries and their written poster/slide deck presentation artifacts. The development process is described sequentially, beginning with a review of relevant literature and existing instruments, followed by the creation of an initial set of interview questions and rubric criteria. The initial versions of the tools were then pilot-tested with multiple ERCs, and feedback sessions with the education/evaluation leaders of those piloting ERCs informed further revisions.

Zhao, Z., O'Donnell, M., Jordan, M., Savenye, W., Roehrig, G., & Lyra, M. (2022, August). Developing common qualitative tools for cross ERC education program evaluation. Paper presented at the 2022 ASEE Annual Conference & Exposition, Minneapolis, MN. 10.18260/1-2--41180

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2022 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.