Toward a Quantitative Engagement Monitor for STEM Education

Conference

2021 ASEE Virtual Annual Conference Content Access

Location

Virtual Conference

Publication Date

July 26, 2021

Start Date

July 26, 2021

End Date

July 19, 2022

Conference Session

NSF Grantees Poster Session

Tagged Topic

NSF Grantees Poster Session

Page Count

13

Permanent URL

https://peer.asee.org/37916

Paper Authors

Aly A. Farag University of Louisville

Aly Farag, Fellow of IEEE and IAPR, received a B.S. in EE from Cairo Univ., M.S. degrees in Bioengineering from the Ohio State Univ. and the Univ. of Michigan, and a PhD in EE from Purdue. He is a Prof. of ECE at the Univ. of Louisville and director of the Computer Vision & Image Processing Laboratory, focusing on research and teaching in computer vision, biometrics, and biomedical imaging. He has introduced over 13 new courses into the ECE curriculum, authored over 400 papers, and edited two volumes on deformable models and a textbook on Biomedical Image Analysis (Cambridge Univ. Press, 2014). He has graduated over 70 MS and PhD students and mentored over 20 postdoctoral researchers. He holds seven US patents on object modeling, computer-aided diagnosis, and visualization. He was lead editor of the IEEE-TIFS special issue on Face Recognition in the Wild (December 2014) and co-general chair of ICIP-2009. He is a recipient of the University's top awards: Research (1999), Teaching (2009, 2011), and Trustees (2015).

Asem Ali University of Louisville orcid.org/0000-0002-0503-4838

Asem M. Ali received the M.S. degree in electrical engineering from Assiut University, Asyut, Egypt, in 2002, and the Ph.D. degree in computer engineering from the University of Louisville, Louisville, KY, USA, in 2008, where he was a Post-Doctoral Researcher with the Computer Vision and Image Processing Laboratory from 2008 to 2011. He was an Assistant Professor with the Department of Electrical Engineering, Assiut University, from 2011 to 2015. He is currently a Research Scientist with the Computer Vision and Image Processing Laboratory. His research interests include image analysis, machine learning, face recognition, and facial expression and emotion recognition. He has authored over 40 papers in journals and conferences.

Islam Alkabbany University of Louisville

James Christopher Foreman University of Louisville orcid.org/0000-0001-6756-2890

Asst. Professor at the University of Louisville, with a previous appointment at Purdue University. Teaches courses related to calculus, power and energy, and industrial control systems. Research in artificial neural networks, expert systems, and new methods of teaching math/calculus. Spent 15 years in the industrial control systems and power generation industries prior to his academic career.

Tom Tretter University of Louisville

Thomas Tretter is professor of science education and director of the Center for Research in Mathematics and Science Teacher Development as well as director of the Gheens Science Hall and Rauch Planetarium at the University of Louisville. His scholarship includes collaborative efforts with science and engineering faculty targeting retention of STEM majors in entry-level STEM courses.

Marci S. DeCaro University of Louisville orcid.org/0000-0001-6753-0725

Marci DeCaro is an Associate Professor in the Department of Psychological and Brain Sciences at the University of Louisville. DeCaro's research applies principles of cognitive psychology to study learning and performance in educational contexts.

Nicholas Carl Hindy University of Louisville

Assistant Professor in the Department of Psychological and Brain Sciences at the University of Louisville

Abstract

Nearly 50% of college students enrolled in engineering programs in the US drop out after the first year, which covers the basic STEM courses that are prerequisites for subsequent courses in the various engineering disciplines. A similar trend, at a lower rate, persists through subsequent STEM coursework. The resulting attrition is costly to students, their families, and society. US institutions have tackled attrition through advances in the educational environment and in the delivery mechanisms of STEM subjects. Despite this progress, engineering attrition due to performance deficiencies in STEM courses persists across US educational institutions and is most evident among low-income, minority, and first-generation college students. Students' engagement (or lack thereof) in the classroom is a strong indicator of performance in STEM subjects.

Using NSF IUSE Program funding, the investigators proposed to: 1) create an experimental biometric sensor network (BSN) using non-intrusive sensors to capture the emotional and behavioral engagement of students in the classroom; 2) use computer vision methodologies to extract robust features that quantify emotional and behavioral engagement, and explore their correlation with cognitive engagement; and 3) deploy the BSN in an early engineering STEM class to obtain real-time data to design, test, and validate an engagement monitor that displays the results on a teacher's dashboard. A prototype of the BSN based on wireless webcams and wristbands has been created and interfaced with a server, through which students' facial and physiological information was captured in real time during lectures. Facial action coding system (FACS) action units were identified using a selective facial part model (SPM) developed at the CVIP Lab. Videos were annotated by an experienced human observer. Features corresponding to the FACS action units were extracted using a novel convolutional neural network (CNN) to obtain robust descriptors via deep learning, optimized to correspond to three levels of engagement: highly engaged, engaged, and not engaged. More than 10,000 frames were used to train the CNN, and testing was performed using real videos from two STEM classes. Preliminary results are very promising. The BSN enables complete autonomy over the data and allows students to be included or excluded without coercion. The BSN can also be deployed online.
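The final step of the pipeline above maps per-frame facial features to one of three engagement levels. As a rough illustration of that classification step only, the following sketch uses a plain linear layer with softmax in place of the paper's trained CNN; the action-unit features, weights, and class order are hypothetical stand-ins, not the authors' model.

```python
import numpy as np

# Illustrative sketch: map a vector of facial action-unit (AU) intensities
# to one of three engagement levels with a linear layer + softmax.
# The weights below are invented for illustration; the paper trains a CNN
# on more than 10,000 annotated frames to learn such a mapping.

LEVELS = ["not engaged", "engaged", "highly engaged"]

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def classify_engagement(au_intensities, W, b):
    """au_intensities: (n_features,) AU activations in [0, 1]."""
    probs = softmax(W @ au_intensities + b)
    return LEVELS[int(np.argmax(probs))], probs

# Toy example: 4 hypothetical AU features (e.g., brow raise, lid tighten, ...)
W = np.array([[-2.0, -1.0, -1.0, -0.5],   # scores for "not engaged"
              [ 0.5,  0.5,  0.2,  0.1],   # scores for "engaged"
              [ 1.5,  2.0,  1.0,  0.8]])  # scores for "highly engaged"
b = np.zeros(3)

label, probs = classify_engagement(np.array([0.9, 0.8, 0.7, 0.6]), W, b)
# With these toy weights, strong AU activations yield "highly engaged".
```

In a real deployment, the per-frame label stream would be aggregated per student and forwarded to the dashboard described below; the linear layer here merely stands in for the CNN's final classification head.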

A GUI resembling a dashboard has been created to display the engagement levels in real time. Current efforts focus on automating the annotation and on establishing coherence between behavioral and emotional engagement for data reduction. Efforts are also directed toward fusing the emotional and behavioral engagement measures and relating them to students' cognitive engagement.

Farag, A. A., & Ali, A., & Alkabbany, I., & Foreman, J. C., & Tretter, T., & DeCaro, M. S., & Hindy, N. C. (2021, July), Toward a Quantitative Engagement Monitor for STEM Education. Paper presented at 2021 ASEE Virtual Annual Conference Content Access, Virtual Conference. https://peer.asee.org/37916

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2021 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015