Conference Location: Minneapolis, MN
Publication Date: August 23, 2022
Conference Start Date: June 26, 2022
Conference End Date: June 29, 2022
Page Count: 16
DOI: 10.18260/1-2--41122
Permanent URL: https://peer.asee.org/41122
Download Count: 1621
Joseph Mirabelli is an Educational Psychology graduate student at the University of Illinois Urbana-Champaign with a focus in Engineering Education. His work focuses on mentorship, mental health, and retention for STEM students and faculty. He was awarded the 2020 NAGAP Gold Award for Graduate Education Research to study engineering faculty perceptions of graduate student well-being and attrition. Before studying education at UIUC, Joseph earned an MS degree in Physics from Indiana University in Bloomington and a BS in Engineering Physics at UIUC.
Karin Jensen, Ph.D. is a Teaching Associate Professor in bioengineering at the University of Illinois Urbana-Champaign. Her research interests include student mental health and wellness, engineering student career pathways, and engagement of engineering faculty in engineering education research. She was awarded a CAREER award from the National Science Foundation for her research on undergraduate mental health in engineering programs. Before joining UIUC she completed a post-doctoral fellowship at Sanofi Oncology in Cambridge, MA. She earned a bachelor’s degree in biological engineering from Cornell University and a Ph.D. in biomedical engineering from the University of Virginia.
Sara Vohra is an undergraduate studying Bioengineering with a minor in Chemistry at the University of Illinois at Urbana-Champaign. Her interests lie in education as well as medicine with a future career goal as a physician.
Eileen Johnson received her bachelor’s and MS degrees in bioengineering from the University of Illinois at Urbana-Champaign. She worked in tissue engineering and genetic engineering throughout her education. During her undergraduate career, she worked with Dr. Brendan Harley, developing biomaterial implants for craniomaxillofacial defects and injuries. In graduate school, she worked with Dr. Pablo Perez-Pinera on new genetic engineering tools. There, she became interested in engineering education after helping develop and teach an online-only laboratory class. She currently works as a research associate under Dr. Karin Jensen with a focus on engineering student mental health, retention, and development of resources.
This theory paper on quantitative methods focuses on the use of Exploratory Factor Analysis (EFA) in engineering education research. EFA techniques are fundamental, widespread, and powerful quantitative tools with applications to scale creation, model design, and understanding measures with complicated covariance structures. Despite the prevalence of EFA methods and tools in engineering education research, best practices and decisions associated with conducting EFA analyses are poorly defined, and standards often differ between disciplines. For instance, when determining which items to retain in a large-item scale, researchers have not reached consensus on cutoff values for factor loadings or on the order in which items falling below those cutoffs should be dropped. To the authors' knowledge, no set of standards for conducting EFA exists in engineering education, nor has one been adopted from another field. Defining these standards can be of significant value to the field, especially to those who are new to engineering education research. In a recent analysis of a novel 81-item scale with N=624 responses, our team recognized an opportunity to compare multiple approaches to conducting an EFA. We performed item retention following four different algorithmic procedures, which differ, for example, in the order in which under-loading versus cross-loading items are removed, plus a fifth synthetic procedure built from the results of the first four. We then compared the results of performing a full EFA analysis with each of these procedures, including goodness-of-fit measures and factor structure. These five analyses were performed so that the multiple approaches would triangulate toward strong items and latent factors and therefore yield a more reliable identification of the factor structure.
With the intent of generating a discussion of EFA standards and practices within engineering education, we present observations and findings from the different algorithmic item-retention procedures applied to a large, 81-item scale. On average, the EFA techniques retained 51 items and 10 factors from the full survey. After presenting the results of these methods, we offer reflections on our analysis techniques and our suggestions for future best practices in EFA work. We discuss the implications of undertaking such exhaustive analysis processes for the reliability of quantitative analyses in engineering education research and for the value of comparing similar analysis techniques in teams conducting research collaboratively.
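The item-retention procedures described in the abstract can be sketched as an iterative loop: fit the EFA, drop an item that under-loads or cross-loads, and refit until no offending items remain. The sketch below is an illustrative assumption, not the authors' analysis code: the cutoffs (0.4 for under-loading, 0.3 for cross-loading), the one-item-at-a-time ordering, and the use of scikit-learn's FactorAnalysis with varimax rotation are all choices supplied here for demonstration.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

def prune_items(X, n_factors, under=0.4, cross=0.3):
    """Iteratively drop under-loading items first, then cross-loading items,
    refitting the EFA after each removal. This is only one of several possible
    orderings; the paper compares four such algorithmic procedures."""
    items = list(range(X.shape[1]))
    while True:
        fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
        fa.fit(X[:, items])
        L = np.abs(fa.components_.T)  # (n_items, n_factors) loading magnitudes
        # Under-loading: no loading on any factor reaches the retention cutoff.
        weak = [i for i, row in enumerate(L) if row.max() < under]
        if weak:
            items.pop(weak[0])  # drop a single item, then refit
            continue
        # Cross-loading: loads meaningfully on two or more factors.
        split = [i for i, row in enumerate(L) if (row >= cross).sum() >= 2]
        if split:
            items.pop(split[0])
            continue
        return items

# Synthetic check: eight items with a clean two-factor structure, plus one
# pure-noise item that the loop should prune as under-loading.
rng = np.random.default_rng(0)
scores = rng.normal(size=(624, 2))       # N=624 matches the paper's sample size
loadings = np.zeros((2, 9))
loadings[0, :4] = 0.8                    # factor 1 drives items 0-3
loadings[1, 4:8] = 0.8                   # factor 2 drives items 4-7
X = scores @ loadings + 0.3 * rng.normal(size=(624, 9))  # item 8 is pure noise
kept = prune_items(X, n_factors=2)
```

Reordering the two checks (removing cross-loading items before under-loading items) yields a different procedure and, potentially, a different retained item set — precisely the kind of variation across algorithmic procedures that the paper compares.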
Mirabelli, J., Jensen, K., Vohra, S., & Johnson, E. (2022, August). Exploring the Exploratory Factor Analysis: Comparisons and Insights from Applying Five Procedures to Determining EFA Item Retention. Paper presented at 2022 ASEE Annual Conference & Exposition, Minneapolis, MN. 10.18260/1-2--41122
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2022 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015