Streamlining the Process of Evaluating the Education and Diversity Impacts across Engineering Research Centers

Conference

2020 ASEE Virtual Annual Conference Content Access

Location

Virtual Online

Publication Date

June 22, 2020

Start Date

June 22, 2020

End Date

June 26, 2020

Conference Session

Instruments and Methods for Studying Student Experiences and Outcomes

Tagged Division

Educational Research and Methods

Page Count

22

DOI

10.18260/1-2--35212

Permanent URL

https://peer.asee.org/35212

Download Count

165

Paper Authors

Zhen Zhao, Arizona State University

Zhen Zhao is a Ph.D. student at Arizona State University in the Fulton Schools of Engineering Polytechnic School. He earned a B.S. in Computer Science and an M.S. in Software Engineering, both from Xi'an Jiaotong University in China. He also received an M.S.E. in Industrial Engineering from Arizona State University. Zhen's research interests include engineering student mentorship ability development, engineering research center education and diversity impact evaluation, engineering student adaptability development, and engineering graduate student attrition. Combining strengths in mathematical modeling and statistics with more than five years of collegiate teaching experience, Zhen is passionate about and dedicated to preparing the future engineering workforce.

Adam R. Carberry, Arizona State University (ORCID: orcid.org/0000-0003-0041-7060)

visit author page

Dr. Adam Carberry is an associate professor at Arizona State University in the Fulton Schools of Engineering Polytechnic School. He earned a B.S. in Materials Science Engineering from Alfred University, and received his M.S. and Ph.D., both from Tufts University, in Chemistry and Engineering Education respectively. His research investigates the development of new classroom innovations and assessment techniques, and identifies new ways to empirically understand how engineering students and educators learn. He is currently the chair of the Research in Engineering Education Network (REEN) and an associate editor for the Journal of Engineering Education (JEE). Prior to joining ASU, he was a graduate student research assistant at the Tufts Center for Engineering Education and Outreach.

Alison Cook-Davis, Arizona State University

Dr. Alison Cook-Davis is Assistant Director for Program Evaluation at Arizona State University's Office of Evaluation and Educational Effectiveness (UOEEE). She has a B.A. in Psychology, an M.S. in Social Psychology, an M.L.S. in Legal Studies, and a Ph.D. in Experimental Social Psychology. Prior to joining UOEEE, she supported the research and program evaluation efforts of the Maricopa County Adult Probation Department and coordinated and executed the research and program evaluation for a large Department of Justice Second Chance Act grant. These efforts included monitoring, assessing, and evaluating the impacts of program outcomes. Since joining the UOEEE in 2015, Dr. Cook-Davis has led research and evaluation activities for over 50 separate grant-funded programs or initiatives funded by the National Science Foundation, U.S. Department of Education, U.S. Department of State, U.S. Department of Agriculture, National Institutes of Health, and The Kern Family Foundation. These projects have focused on the evaluation of student success, outreach impacts, innovative learning techniques, and STEM-related interventions and curricula.

Jean S. Larson, Arizona State University (ORCID: orcid.org/0000-0003-4898-2149)

Jean Larson, Ph.D., is the Educational Director for the NSF-funded Engineering Research Center for Bio-mediated and Bio-inspired Geotechnics (CBBG), and Assistant Research Professor in both the School of Sustainable Engineering and the Built Environment and the Division of Educational Leadership and Innovation at Arizona State University. She has a Ph.D. in Educational Technology, postgraduate training in Computer Systems Engineering, and many years of experience teaching and developing curriculum in various learning environments. She has taught technology integration and teacher training to undergraduate and graduate students at Arizona State University, students at the K-12 level locally and abroad, and various workshops and modules in business and industry. Dr. Larson is experienced in instructional design, delivery, and evaluation, and specializes in eLearning technologies for training and development. Her research focuses on the efficient and effective transfer of knowledge and learning techniques, innovative and interdisciplinary collaboration, and strengthening the bridge between K-12 learning and higher education in terms of engineering content.

Michelle Jordan, Arizona State University

Michelle Jordan is an associate professor in the Mary Lou Fulton Teachers College at Arizona State University. She also serves as the Education Director for the QESST Engineering Research Center. Michelle's program of research focuses on social interactions in collaborative learning contexts. She is particularly interested in how students navigate communication challenges as they negotiate complex engineering design projects. Her scholarship is grounded in notions of learning as a social process, influenced by complexity theories, sociocultural theories, sociolinguistics, and the learning sciences.

Wendy M. Barnard, Arizona State University

Wendy Barnard is an Assistant Research Professor and Director of the College Research and Evaluation Services Team (CREST) at Arizona State University. Dr. Barnard received her Ph.D. from the University of Wisconsin-Madison, where she focused on the impact of early education experiences and parent involvement on long-term academic achievement. Her research interests include evaluation methodology, longitudinal research design, STEM educational efforts, and the impact of professional development on teacher performance. Currently, she works on evaluation efforts for National Science Foundation, U.S. Department of Education, local foundation, and state grants.

Megan O'Donnell, Arizona State University

Megan O'Donnell is a Research Professional in the College Research and Evaluation Services Team (CREST). Dr. O'Donnell received her Ph.D. from Arizona State University, where she focused on risk and resiliency processes in Mexican American adolescents. Her current research and evaluation interests include evaluation methodology, mixed methods design, and the evaluation of engineering and STEAM education programs. Currently, she works on evaluation efforts for the U.S. Department of Education, National Science Foundation, local foundations, and state grants.

Wilhelmina C. Savenye, Arizona State University

Dr. Wilhelmina "Willi" C. Savenye is a Professor Emeritus of Learning, Design and Technologies / Educational Technology at Arizona State University. She is a former Education Director, and currently serves as Senior Education Advisor, for the NSF Engineering Research Center for Bio-mediated and Bio-inspired Geotechnics (CBBG). She previously taught at the University of Texas at Austin and San Diego State University. She earned her M.Ed. and Ph.D. in Educational Technology from ASU, and a B.A. in Anthropology from the University of Washington. Dr. Savenye focuses on instructional design and evaluation of technology-based and online learning systems, employing both quantitative and qualitative research methodologies. She has published over 70 articles and book chapters; made over 140 conference presentations and workshops in the U.S., Europe, and Asia; been awarded numerous grants; and produced many digital learning programs. She is Editor Emeritus of the Journal of Applied Instructional Design. She has served on the editorial boards of journals including Educational Technology Research and Development and the Quarterly Review of Distance Education, and reviews for additional journals. She served on the editorial board for the Encyclopedia of Educational Technology and has held elected leadership positions.
Dr. Savenye's instructional design and evaluation work has been conducted in such diverse settings as engineering education, school districts, museums, botanical gardens, zoos, universities, corporations, and Army tank maintenance training.


Abstract

The Engineering Research Centers (ERCs), funded by the National Science Foundation (NSF), play an important role in improving engineering education, bridging engineering academia and broader communities, and promoting a culture of diversity and inclusion. Each ERC must partner with an independent evaluation team to annually assess its performance and impact on advancing education, connecting with communities, and building a culture of diversity. This evaluation is currently performed independently (and in isolation), which leads to inconsistent evaluations and a redundant investment of ERC resources in tasks such as developing evaluation instruments. These isolated efforts by ERCs to quantitatively evaluate their education programs also typically lack an adequate sample size within a single center, which limits the validity and reliability of the quantitative analyses.

Three ERCs, all associated with a large southwest university in the United States, worked collaboratively to overcome sample size and measurement inconsistency concerns by developing a common quantitative instrument capable of evaluating any ERC's education and diversity impacts. The instrument is the result of a systematic process of comparing and contrasting each ERC's existing evaluation tools, including surveys and interview protocols. This new, streamlined tool captures participants' overall experience as part of the ERC by measuring various constructs, including skillset development, perception of diversity and inclusion, future plans after participating in the ERC, and mentorship received from the ERC. Scales and embedded items were designed broadly for possible use with both yearlong participants (e.g., graduate and undergraduate students and postdoctoral scholars) and summer program participants (Research Experience for Undergraduates, Research Experience for Teachers, and Young Scholar Program). The instrument was distributed and tested during Summer 2019 with participants in the summer programs of all three ERCs. This paper presents the new common cross-ERC evaluation instrument, describes the effort of collecting data across all three ERCs, presents preliminary findings, and discusses collaborative processes and challenges. A preliminary implication of this work is the ability to directly compare educational programs across ERCs. The authors also believe that this tool can give new ERCs a fast start in evaluating their educational programs.

Zhao, Z., & Carberry, A. R., & Cook-Davis, A., & Larson, J. S., & Jordan, M., & Barnard, W. M., & O'Donnell, M., & Savenye, W. C. (2020, June), Streamlining the Process of Evaluating the Education and Diversity Impacts across Engineering Research Centers. Paper presented at 2020 ASEE Virtual Annual Conference Content Access, Virtual Online. 10.18260/1-2--35212

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2020 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.