A Comparison of Two Scenario-Based Assessments of Systems Thinking

Conference: 2022 ASEE Annual Conference & Exposition

Location: Minneapolis, MN

Publication Date: August 23, 2022

Conference Dates: June 26–29, 2022

Conference Session: Systems Engineering Division Technical Session 1

Page Count: 18

DOI: 10.18260/1-2--40721

Permanent URL: https://peer.asee.org/40721


Paper Authors

Siddhant Joshi, Purdue University at West Lafayette (COE)

Siddhant, from Pune, India, is a doctoral student pursuing his Ph.D. in the School of Engineering Education at Purdue University. Prior to starting his Ph.D., Siddhant completed his M.S. in Aeronautics and Astronautics at Purdue University and a B.E. in Mechanical Engineering at MIT World Peace University. To complement his academic experience, Siddhant has a year of industry experience as a Lean and Operational Excellence trainee at Sandvik Asia. At Purdue University, Siddhant is also an instructor for two courses in the Gifted Education Research and Resource Institute Summer Residential program and has recently introduced a new course for aspiring engineering students. Apart from academics, Siddhant currently serves as the Treasurer of the American Society for Engineering Education - Purdue Chapter and is a member of the Graduate Student Advisory Council, with a focus on ensuring a better engineering experience for undergraduate and graduate students.

Kirsten Davis, Purdue University at West Lafayette (COE)

Kirsten Davis is an assistant professor in the School of Engineering Education at Purdue University. Her research explores the intentional design and assessment of global engineering programs, student development through experiential learning, and approaches for teaching and assessing systems thinking skills. Kirsten holds a B.S. in Engineering & Management from Clarkson University and an M.A.Ed. in Higher Education, M.S. in Systems Engineering, and Ph.D. in Engineering Education, all from Virginia Tech.

Lori Czerwionka, Purdue University at West Lafayette (PPI)

Elisa Camps Troncoso, Purdue University at West Lafayette (PPI)

Francisco Montalvo

Abstract

Engineers face complex, multidisciplinary problems in the modern work environment. To understand and solve these problems, engineers require systems thinking skills that allow them to consider interconnected technical and contextual factors. It is therefore important to provide engineering students with opportunities to develop these skills during their education. Part of this process is developing assessment approaches that can help instructors measure students' systems thinking ability. A variety of approaches have been used in the literature to assess the development of systems thinking, including surveys, interviews, design projects, and scenario-based instruments. Scenario-based assessments can offer a more in-depth view of student learning than typical surveys while also being faster to analyze than open-ended data such as interviews. However, many available scenario-based assessments claim to assess similar skills, making it challenging to identify which best fits the needs of a particular educational context. To help address this challenge, we compared two scenario-based assessments, the Village of Abeesee scenario [1] and the Energy Conversion Playground (ECP) design task [2], to understand which concepts of systems thinking each instrument emphasizes and how students' scores on the two assessments are related. The participants in this study were 19 undergraduate engineering students enrolled in an interdisciplinary humanities-based engineering course in Spring 2021. We administered both scenario-based assessments at the start and end of the semester to examine the change in students' scores over time. We then compared the assessment results from each instrument by examining average scores for each of the systems thinking dimensions as well as individual total scores on each assessment. Lastly, we compared the experience of scoring the assessments from the perspective of the instructor or researcher using them.

Based on our findings, we make recommendations about when an instructor might choose one assessment over the other. Our results can inform future research and assessment projects that aim to assess students' systems thinking skills by comparing both student outcomes and instructor experience for these scenario-based assessments.

Keywords: systems thinking, mixed methods, scenario-based assessments, humanities-informed engineering

Joshi, S., Davis, K., Czerwionka, L., Camps Troncoso, E., & Montalvo, F. (2022, August). A Comparison of Two Scenario-Based Assessments of Systems Thinking. Paper presented at the 2022 ASEE Annual Conference & Exposition, Minneapolis, MN. 10.18260/1-2--40721

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2022 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015