Baltimore, Maryland
June 25, 2023
June 28, 2023
Computers in Education Division (COED)
15
10.18260/1-2--42566
https://peer.asee.org/42566
215
Dr. Jaejin Hwang is an Associate Professor of Industrial and Systems Engineering at NIU. His expertise lies in physical ergonomics, occupational biomechanics, and exposure assessment. His representative works include the design of VR/AR user interfaces to minimize the physical and cognitive demands on users. He specializes in measuring bodily movement as well as muscle activity and intensity to assess responses to physical and environmental stimuli. In this project, he will lead multimodal behavioral data collection, processing, and analysis to assess children's learning and affective behaviors.
Graduate student in the Department of Industrial and Systems Engineering at Northern Illinois University, working as a research assistant on an NSF-funded project.
The purpose of this three-year project (funded by NSF EHR-ITEST) is to develop an innovative mixed-reality environment that supports early development of computational thinking. Grounded in embodied cognition and social robotics, the project team is designing and studying an environment that combines augmented reality (AR) technology and a physically embodied social robot. In this environment, children in grades K-2, each holding a tablet, walk around a 5x5 chessboard-like grid on a floor mat to help a robot (Linibot) find a path toward a goal. The robot guides the children with instructions, cues, and corrective and motivational feedback, while the tablet displays an equivalent map and several AR obstacles for the child to avoid. The specific learning objectives are foundational STEM problem-solving skills, including the understanding of symbols and sequences that cut across STEM domains, and developing children's confidence in using advanced technology. Importantly, we unobtrusively assess children's progress while they play in the environment using multimodal behavioral data collection technology: automated interaction logs capture their walking distance and the time taken to reach the goal, while an optical motion capture system and electromyography (EMG) precisely record bodily gestures, postures, and socio-emotional dynamics. We have been developing the mixed-reality environment iteratively over one and a half years, testing our ongoing designs with twenty-five children in informal settings (our lab, a community center, and a STEM showcase event). Each test has had a different focus depending on the developmental progress of the environment.
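To make the path-finding task concrete, the game described above can be sketched as a shortest-path search on the 5x5 grid with obstacle cells blocked off. This is only an illustrative sketch: the cell coordinates, obstacle positions, and the breadth-first-search approach are assumptions for exposition, not the project's actual implementation.

```python
from collections import deque

def find_path(start, goal, obstacles, size=5):
    """Breadth-first search for a shortest path on a size x size grid.

    start, goal: (row, col) tuples; obstacles: set of blocked (row, col)
    cells, standing in for the AR obstacles the child must avoid.
    Returns the list of cells visited, or None if the goal is unreachable.
    """
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        r, c = path[-1]
        if (r, c) == goal:
            return path
        # The child steps to an adjacent mat cell: up, down, left, or right.
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in visited and nxt not in obstacles):
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None  # goal unreachable

# Example: corner to corner with two hypothetical obstacle cells.
path = find_path((0, 0), (4, 4), {(2, 2), (1, 3)})
```

A shortest corner-to-corner route on a 5x5 grid takes eight steps (nine cells), which gives a simple baseline against which a child's actual walking distance could be compared.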
In this poster session, we will present the results of our most recent implementation, conducted one-on-one with seventeen boys and girls at a local one-day STEM showcase event held two weeks ago. The children were aged six to eleven, and their ethnicities included Caucasian, Asian, and African American. The parents and children voluntarily walked into our booth. After obtaining parental consent, each child played two episodes of the path-finding game: Game 1 took five to ten minutes and Game 2 took ten to twenty minutes. Through our observations and conversations with the parents and children, we noticed a great range in the children's abilities and computing experience. Currently, we are analyzing the data from the interaction logs to assess each child's walking distance and the time taken to reach the goal, while fine-tuning the design of the AR-enabled obstacles and the robot's utterances. Following this, we will analyze the data from the sensor technologies (bodily gestures, postures, and socio-emotional dynamics) and compare the two data sets to understand how they enable authentic assessment of children's learning progress. This poster session will present the outcome of our analyses and discuss the implications of this advanced-technology-enabled environment for supporting developmentally appropriate learning and ecologically valid assessment.
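The two log-based measures mentioned above, walking distance and time taken to reach the goal, can be derived directly from a timestamped interaction log. The log schema below (timestamped grid cells) is an illustrative assumption, not the project's actual log format.

```python
def summarize_log(log):
    """Compute walking distance and time-to-goal from an interaction log.

    log: chronologically ordered list of (timestamp_seconds, (row, col))
    entries, one per mat cell the child stepped on (a hypothetical schema).
    """
    # Walking distance: total cell-to-cell steps (Manhattan distance
    # between consecutive log entries, summed over the whole episode).
    steps = sum(
        abs(r2 - r1) + abs(c2 - c1)
        for (_, (r1, c1)), (_, (r2, c2)) in zip(log, log[1:])
    )
    # Time to goal: elapsed time from the first to the last log entry.
    elapsed = log[-1][0] - log[0][0]
    return {"distance_cells": steps, "time_to_goal_s": elapsed}

# Example: a child takes two steps over 7.5 seconds.
summary = summarize_log([(0.0, (0, 0)), (3.2, (0, 1)), (7.5, (1, 1))])
```

Comparing these coarse log metrics against the motion-capture and EMG streams is one way the two data sets could be aligned for the cross-validation the abstract describes.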
Hwang, J., Lee, S., Kim, Y., Zaman, M., & Pokhrel, S. (2023, June). Active Project: Supporting Young Children's Computational Thinking Skills Using a Mixed-Reality Environment. Paper presented at the 2023 ASEE Annual Conference & Exposition, Baltimore, Maryland. 10.18260/1-2--42566
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2023 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015