
Creating Capacity to Explore what Students Learn from Reflection Activities: Validating the Knowledge-gain Survey


Conference

2021 ASEE Virtual Annual Conference Content Access

Location

Virtual Conference

Publication Date

July 26, 2021

Start Date

July 26, 2021

End Date

July 19, 2022

Conference Session

Studies of Classroom Assessment: Exam Wrappers, Equitable Grading, Test Anxiety, and Use of Reflection

Tagged Division

Educational Research and Methods

Page Count

13

DOI

10.18260/1-2--36872

Permanent URL

https://peer.asee.org/36872


Paper Authors


Kenya Z. Mejia, University of Washington


Kenya Z. Mejia is a third-year PhD student in the Human Centered Design & Engineering program at the University of Washington. Her work focuses on diversity and inclusion in engineering education, with an emphasis on engineering design education.


Jennifer A. Turns, University of Washington


Jennifer Turns is a Professor in the Department of Human Centered Design & Engineering at the University of Washington. She is interested in all aspects of engineering education, including how to support engineering students in reflecting on experience, how to help engineering educators make effective teaching decisions, and the application of ideas from complexity science to the challenges of engineering education.



Abstract

This paper reports on the methodological process of validating a survey instrument to measure student learning from reflection activities. Reflection is thought to be a helpful teaching and learning tool. In engineering education, reflection is gaining traction as a tool to help students think about their study habits, exam performance, command of the course content, and team interactions. Yet few validated instruments exist to systematically document what students are learning from reflection experiences. The purpose of this research project is to validate an instrument that captures the knowledge gains students make from doing reflection activities in a course context. A validated survey will allow researchers and educators to compare knowledge gains across activities, between classes, and even across institutions. To create the instrument, the research team followed a standard survey validation process. The survey's 72 items were developed using established learning models, such as the cognitive and affective domains of Bloom's Taxonomy and L. Dee Fink's Taxonomy of Significant Learning, to ensure we captured multiple kinds of learning. Students respond by indicating their level of agreement with each item on a Likert scale. Some items ask about expected learning outcomes, such as "I better understood what had been confusing about a topic" and "I understood how the topics in this course can be applied to the real world," which relate to course content knowledge and knowledge relevant to students' careers. Other items ask about novel potential learning outcomes, such as "I realized the skills I gained [in this context] will help me in my career" and "my confidence as a student increased," which help students connect classroom learning to the real world and make their own progress in learning more visible. Pilot tests were conducted for comprehensibility, and data collection is in progress. Exploratory factor analysis will be used to group and reduce the 72 current items; the analysis focuses on undergraduate engineering students who completed a reflection activity in the past academic year, and we are using a stratified random sampling approach to ensure we document student learning from a diversity of reflection activities. Here, we report the results of the exploratory factor analysis, highlighting the reduced set of questions contributing to the validated survey. We have reduced the number of questions from 72 to 16, grouped into four factors: Engineering Self, Course Understandings, Areas for Growth, and Social Impact. We report data on inter-item reliability and plan follow-up studies on the final draft to establish concurrent validity and test-retest reliability. The contribution of this work is a validated survey that will allow the engineering community to learn more about what students learn from doing reflection activities and what settings or types of activities lead to specific characteristics of learning.
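As a rough illustration of the kind of analysis the abstract describes, the sketch below runs an exploratory factor analysis over Likert-scale item responses and computes Cronbach's alpha as an inter-item reliability check. It assumes responses are stored in a pandas DataFrame with one column per survey item; the Python packages used (pandas, factor_analyzer), the file name, the promax rotation, and the 0.40 loading cutoff are illustrative assumptions and are not taken from the paper.

# Minimal sketch: exploratory factor analysis + inter-item reliability
# for Likert-scale survey items. Assumes one column per item, one row
# per student; package and parameter choices are illustrative only.
import pandas as pd
from factor_analyzer import FactorAnalyzer

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Inter-item reliability (Cronbach's alpha) for one factor's items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical file: rows = students, columns = the 72 Likert items (1-5).
responses = pd.read_csv("reflection_survey_responses.csv")

# Exploratory factor analysis with an oblique (promax) rotation,
# retaining four factors as in the reported solution.
efa = FactorAnalyzer(n_factors=4, rotation="promax")
efa.fit(responses)
loadings = pd.DataFrame(efa.loadings_, index=responses.columns)

# Keep items whose strongest absolute loading exceeds a conventional
# cutoff (0.40 here), then report reliability for each retained factor.
primary = loadings.abs().idxmax(axis=1)
retained = loadings.abs().max(axis=1) >= 0.40
for factor in range(4):
    items = responses.loc[:, (primary == factor) & retained]
    if items.shape[1] > 1:
        print(f"Factor {factor}: {items.shape[1]} items, "
              f"alpha = {cronbach_alpha(items):.2f}")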

Mejia, K. Z., & Turns, J. A. (2021, July), Creating Capacity to Explore what Students Learn from Reflection Activities: Validating the Knowledge-gain Survey Paper presented at 2021 ASEE Virtual Annual Conference Content Access, Virtual Conference. 10.18260/1-2--36872

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2021 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015