
Active Project: Supporting Young Children’s Computational Thinking Skills Using a Mixed-Reality Environment


Conference

2023 ASEE Annual Conference & Exposition

Location

Baltimore, Maryland

Publication Date

June 25, 2023

Start Date

June 25, 2023

End Date

June 28, 2023

Conference Session

COED: Computing in K-12 / Early Childhood Education

Tagged Division

Computers in Education Division (COED)

Page Count

15

DOI

10.18260/1-2--42566

Permanent URL

https://peer.asee.org/42566


Paper Authors

Jaejin Hwang, Northern Illinois University

Dr. Jaejin Hwang is an Associate Professor of Industrial and Systems Engineering at NIU. His expertise lies in physical ergonomics, occupational biomechanics, and exposure assessment. His representative works include the design of VR/AR user interfaces that minimize the physical and cognitive demands on users. He specializes in measuring bodily movement as well as muscle activity and intensity to assess responses to physical and environmental stimuli. In this project, he will lead multimodal behavioral data collection, processing, and analysis to assess children’s learning and affective behaviors.

Sungchul Lee, Sun Moon University, South Korea

Yanghee Kim

Mobasshira Zaman, Northern Illinois University

Graduate student in the Department of Industrial and Systems Engineering at Northern Illinois University, working as a research assistant on an NSF-funded project.

Sobhit Pokhrel


Abstract

The purpose of this three-year project (funded by NSF EHR-ITEST) is to develop an innovative mixed-reality environment that supports the early development of computational thinking. Grounded in embodied cognition and social robotics, the project team is designing and studying an environment that combines augmented reality (AR) technology with a physically embodied social robot. In this environment, children in grades K-2, each holding a tablet, walk around on a 5x5 chessboard-like grid on a floor mat to help a robot (Linibot) find a path toward a goal. The robot guides the children with instructions, cues, and corrective and motivational feedback, while the tablet displays an equivalent map and several AR obstacles for the child to avoid. The specific learning objectives are foundational STEM problem-solving skills, including an understanding of symbols and sequences that cut across STEM domains, and the development of children’s confidence in using advanced technology. Importantly, we unobtrusively assess children’s progress while they play in the environment using multimodal behavioral data collection technology: automated interaction logs capture walking distance and time taken to reach the goal, while an optical motion capture system and electromyography (EMG) precisely record bodily gestures, postures, and socio-emotional dynamics. We have been developing the mixed-reality environment iteratively over one and a half years, testing our ongoing designs with twenty-five children in informal settings (our lab, a community center, and a STEM showcase event). Each test has had a different focus depending on the developmental progress of the environment.
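To make the interaction-log metrics concrete, here is a minimal sketch of how walking distance and time-to-goal might be derived from a sequence of timestamped grid positions on the 5x5 floor mat. This is an illustration only; the function and log format are assumptions, not the project's actual logging code.

```python
from typing import List, Tuple

# Hypothetical log entry: (timestamp in seconds, (row, col) grid cell).
LogEntry = Tuple[float, Tuple[int, int]]

def path_metrics(log: List[LogEntry]) -> Tuple[int, float]:
    """Return (walking distance in grid steps, elapsed seconds to goal).

    Distance is the sum of Manhattan distances between consecutive cells,
    which matches step-by-step movement on a chessboard-like grid.
    """
    distance = 0
    for (_, a), (_, b) in zip(log, log[1:]):
        distance += abs(a[0] - b[0]) + abs(a[1] - b[1])
    elapsed = log[-1][0] - log[0][0] if log else 0.0
    return distance, elapsed

# Example: a child starts at (0, 0) and reaches the goal at (2, 1).
log = [(0.0, (0, 0)), (3.2, (1, 0)), (7.5, (1, 1)), (12.0, (2, 1))]
print(path_metrics(log))  # (3, 12.0)
```

A longer walking distance or time for the same start and goal would indicate detours (for example, around AR obstacles or after wrong turns), which is the kind of signal such logs could surface for assessment.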

In this poster session, we will present the results of our most recent implementation, conducted one-on-one with seventeen boys and girls at a local one-day STEM showcase event held two weeks ago. The children were aged six to eleven, and their ethnicities included Caucasian, Asian, and African American. Parents and children voluntarily walked into our booth. After obtaining parental consent, each child played two episodes of the path-finding game: Game 1 took five to ten minutes and Game 2 took ten to twenty minutes. Through our observations and conversations with the parents and children, we noticed a great range in the children’s abilities and computing experience. We are currently analyzing the interaction-log data to assess each child’s walking distance and time taken to reach the goal, while fine-tuning the design of the AR-enabled obstacles and the robot’s utterances. Following this, we will analyze the data from the sensor technologies (bodily gestures, postures, and socio-emotional dynamics) and compare the two data sets to understand how they enable authentic assessment of children’s learning progress. This poster session will present the outcome of our analyses and discuss the implications of this advanced-technology-enabled environment for supporting developmentally appropriate learning and ecologically valid assessment.

Hwang, J., & Lee, S., & Kim, Y., & Zaman, M., & Pokhrel, S. (2023, June), Active Project: Supporting Young Children’s Computational Thinking Skills Using a Mixed-Reality Environment. Paper presented at 2023 ASEE Annual Conference & Exposition, Baltimore, Maryland. 10.18260/1-2--42566

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2023 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015