
Design-based Evaluation: A Novel Evaluation Approach to Examine Designed Programs in Engineering Education


Conference

2019 ASEE Annual Conference & Exposition

Location

Tampa, Florida

Publication Date

June 15, 2019

Start Date

June 15, 2019

End Date

June 19, 2019

Conference Session

ERM Technical Session 22: Perspectives and Evaluation of Engineering Design Education

Tagged Division

Educational Research and Methods

Page Count

15

DOI

10.18260/1-2--32609

Permanent URL

https://peer.asee.org/32609

Download Count

437


Paper Authors

Lori C. Bland, George Mason University (orcid.org/0000-0003-4411-634X)

Lori C. Bland, Ph.D., is an associate clinical professor of curriculum and research, and the Director of Curriculum, Center for Gifted Education at The College of William and Mary. She teaches courses in program evaluation, educational assessment, educational psychology, data-driven decision-making, and gifted education. Bland received her Ph.D. in Educational Psychology from the University of Virginia. Her current research focuses on assessing learning and professional outcomes in formal and informal STEM learning environments; how data are used to inform decision-making; and the uses of different research, evaluation, and assessment methods to solve educational problems.

Margret Hjalmarson, George Mason University

Margret Hjalmarson is a Professor in the Graduate School of Education at George Mason University. Her research interests include engineering education, mathematics education, faculty development and mathematics teacher leadership.

Anastasia P. Samaras, George Mason University

Anastasia P. Samaras is Professor of Education in the College of Education and Human Development at George Mason University, USA. She is an educational researcher and pedagogical scholar with signature work in self-study research methodology, including as co-editor of Polyvocal Professional Learning through Self-Study Research (2015), author of Self-Study Teacher Research (2011), and lead editor of Learning Communities in Practice (2008). She is a recipient of the Dissertation Research Award (University of Virginia) and the Outstanding Scholar Award (University of Maryland), and has served as a Fulbright Scholar and a Visiting Self-Study Scholar. She chaired S-STEP from 2013 to 2015 and is a current co-PI of two National Science Foundation (NSF) funded grants: Designing Teaching: Scaling up the SIMPLE Design Framework for Interactive Teaching Development and a research initiation grant, Student-directed differentiated learning in college-level engineering education. Her research centers on facilitating and studying her role in faculty development self-study collaboratives.

Jill K. Nelson, George Mason University

Jill Nelson is an associate professor in the Department of Electrical and Computer Engineering at George Mason University. She earned a BS in Electrical Engineering and a BA in Economics from Rice University in 1998. She attended the University of Illinois at Urbana-Champaign for graduate study, earning an MS and PhD in Electrical Engineering in 2001 and 2005, respectively. Dr. Nelson's research focus is in statistical signal processing, specifically detection and estimation for applications in target tracking and physical layer communications. Her work on target detection and tracking is funded by the Office of Naval Research. Dr. Nelson is a 2010 recipient of the NSF CAREER Award. She is a member of Phi Beta Kappa, Tau Beta Pi, Eta Kappa Nu, and the IEEE Signal Processing, Communications, and Education Societies.


Abstract

The purpose of this theoretical paper is to introduce design-based evaluation (DBE) as a novel evaluation approach that can be useful to the engineering education research community. To begin, we define DBE and describe the salient characteristics and elements of the DBE approach. Then, we explain why DBE is a unique evaluation approach. We position DBE as an evaluation approach for examining design-based research projects and discuss why current evaluation approaches are not sufficient for design-based research. We also compare and contrast current evaluation approaches with DBE. Finally, we provide a brief description of an effective application of DBE.

The motivation and background for this theoretical paper were grounded in identifying an adequate evaluation approach for a National Science Foundation-funded faculty development project (Authors, XXXX). The purpose of this design-based research project was to support and examine undergraduate STEM faculty change processes toward adopting, implementing, examining, and writing about interactive teaching strategies.

Design-based research is a research methodology that arose from the learning sciences to examine innovative design in educational programs. Kelly et al. (2008) describe the goal of design-based research as the intersection of design processes and research methods to study learning and teaching. Design-based research sits at the juncture of multiple competing processes, such as the dynamic (design) versus the static (research), the creative (design) versus the canonical (research), and real-world problem solving (design) versus theory building (research). Design-based research is an apt “methodological stance” for research in engineering because of the role of design in solving real-world engineering problems (Author, 2008, p. 96).

The evaluation of dynamically designed research poses challenges for evaluators (Author, 2017). Extant evaluation approaches, such as those that examine process (e.g., Stufflebeam, 2005), could have been applied to this project. However, they did not adequately capture the elements of this design-based research project. For example, an underlying assumption of the project was that faculty would design their own learning trajectories, and hence faculty processes and outcomes would necessarily vary. The evaluated study was intentionally designed to allow for, and indeed encourage, variation in both the processes evaluated and the faculty outcomes. More specifically, variation within the program design encouraged variation within site implementation. The project's intention was to study the nature of the variation within the faculty change processes and the unique types of outcomes attained by faculty.

This type of study is not easily addressed by current evaluation approaches. Extant evaluation approaches examine fidelity of implementation, but they neither consider that variations in implementation may be acceptable nor account for variation in processes or outcomes. Rather, current evaluation approaches examine implementation and variation within the given outcome(s) of a treatment, typically the degree to which a given outcome is attained. DBE allowed us to address variation across and within processes and outcomes to determine which design worked best among the different implementations. DBE can be used to evaluate design and change within engineering education.

Bland, L. C., & Hjalmarson, M., & Samaras, A. P., & Nelson, J. K. (2019, June), Design-based Evaluation: A Novel Evaluation Approach to Examine Designed Programs in Engineering Education Paper presented at 2019 ASEE Annual Conference & Exposition , Tampa, Florida. 10.18260/1-2--32609

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2019 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.