Montreal, Quebec, Canada
June 22, 2025
Diversity and NSF Grantees Poster Session
DOI: 10.18260/1-2--55603
https://peer.asee.org/55603
Dr. Jason Morphew is currently an assistant professor at Purdue University in Engineering Education and serves as the director of undergraduate curriculum and advanced learning technologies for SCALE. Dr. Morphew is also affiliated with the Center for Advancing the Teaching and Learning of STEM and the INSPIRE Research Institute for Pre-College Engineering. Dr. Morphew's research focuses on the application of principles of learning derived from cognitive science and the learning sciences to the design of technology-enhanced learning environments. His research builds on design principles to examine the impact of educational technologies on student learning, interest, engagement, and metacognition in STEM.
I am Amirreza Mehrabi, a Ph.D. student in Engineering Education at Purdue University, West Lafayette. I currently work on enhancing computerized adaptive testing (CAT) with AI and on analyzing large-scale educational data with machine learning (ML) under Prof. J. W. Morphew in the ENE department. I completed my master's degree in engineering education at the UNESCO Chair on Engineering Education at the University of Tehran. My research interests include human adaptation to technology and modeling human behavior with machine learning and cognitive research. My background is in Industrial Engineering (B.Sc., Sharif University of Technology), and I received the gold medal in the Industrial Engineering Olympiad (Iran, 2021), the highest-level prize in Iran. I also work as a researcher on an Erasmus project funded by the European Union ($1M; European Union and 7 Iranian universities) that focuses on technology-enhanced learning (TEL) and the adoption of modern educational technology by students and professors. In addition, I collaborated with Dr. Taheri on "R Applications in Engineering Statistics," a companion to his book "Engineering Probability and Statistics."
Funded by the Improving Undergraduate STEM Education program of the National Science Foundation, our project focuses on developing and implementing computerized adaptive testing (CAT) in a freely accessible online platform named LASSO that encompasses several conceptual inventories across STEM. CAT is an adaptive assessment method that selects test items based on students' real-time performance. This adaptive approach allows for precise and efficient measurement of student proficiency (sometimes also referred to as ability). By selecting questions at the appropriate difficulty level for each student, the assessment system in LASSO is able to apply several algorithmic models to derive information about student skill mastery, content area learning, and student conceptual profiles. By developing an in-depth and detailed profile for each student, the adaptive testing system is able to provide instructors with individualized insights into student learning, which is particularly valuable for large-enrollment introductory STEM courses where instructors are not able to collect this data in real time.
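As a rough illustration of the adaptive selection loop described above (not the LASSO implementation itself), the Python sketch below selects each item by maximum Fisher information under a two-parameter logistic (2PL) IRT model and re-estimates proficiency after every response; the item bank, parameters, and simulated responses are invented for the example.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL probability of a correct response at ability theta."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_correct(theta, a, b)
    return a**2 * p * (1.0 - p)

def estimate_theta(responses, items, grid=np.linspace(-4, 4, 161)):
    """EAP estimate of theta on a grid with a standard normal prior."""
    log_post = -0.5 * grid**2  # log of N(0, 1) prior, up to a constant
    for (a, b), x in zip(items, responses):
        p = p_correct(grid, a, b)
        log_post += x * np.log(p) + (1 - x) * np.log(1 - p)
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    return float(np.sum(grid * post))

def run_cat(item_bank, true_theta, n_items=10, rng=np.random.default_rng(0)):
    """Administer n_items adaptively from item_bank = [(a, b), ...]."""
    theta_hat, administered, responses = 0.0, [], []
    available = list(range(len(item_bank)))
    for _ in range(n_items):
        # Pick the unused item that is most informative at the current estimate.
        best = max(available, key=lambda i: item_information(theta_hat, *item_bank[i]))
        available.remove(best)
        a, b = item_bank[best]
        x = int(rng.random() < p_correct(true_theta, a, b))  # simulated response
        administered.append((a, b))
        responses.append(x)
        theta_hat = estimate_theta(responses, administered)
    return theta_hat

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    bank = [(rng.uniform(0.8, 2.0), rng.uniform(-2.5, 2.5)) for _ in range(200)]
    print("estimated proficiency:", run_cat(bank, true_theta=1.2, rng=rng))
```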
The core of our adaptive testing system uses Item Response Theory (IRT) and Cognitive Diagnostic Models (CDMs) to provide detailed analyses of student proficiency and skill mastery. IRT offers precise metrics by modeling the relationship between item characteristics and student abilities, providing a fine-tuned understanding of how students interact with assessment items. CDMs further enhance this process by identifying the underlying skills students have mastered. CDMs are also able to model content area mastery for content areas such as momentum, energy conservation, two-dimensional kinematics, etc. Further, Transition Diagnostic Classification Models (TDCMs) offer the ability to develop conceptual profiles that identify student misconceptions from the specific incorrect answers students select. These models offer a granular view of the cognitive strengths and weaknesses of students and allow instructors to identify the specific areas where their students need improvement.
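To make the diagnostic modeling layer concrete, here is a minimal sketch of one common CDM, the DINA model, which is offered only as an illustration of how a CDM classifies skill mastery and is not the project's actual model: a Q-matrix maps items to required skills, and a student's latent mastery profile determines the probability of a correct response through item slip and guess parameters. The Q-matrix, parameter values, and skill labels below are invented for the example.

```python
import numpy as np
from itertools import product

# Q-matrix: rows = items, columns = skills (1 = item requires that skill).
# Skill labels here are illustrative: [momentum, energy conservation, 2D kinematics].
Q = np.array([
    [1, 0, 0],
    [0, 1, 0],
    [1, 1, 0],
    [0, 0, 1],
    [1, 0, 1],
])
slip  = np.array([0.10, 0.15, 0.10, 0.20, 0.12])  # P(incorrect | has all required skills)
guess = np.array([0.20, 0.25, 0.15, 0.20, 0.18])  # P(correct   | missing a required skill)

def dina_p_correct(alpha):
    """P(correct) for each item given mastery profile alpha (0/1 vector over skills)."""
    eta = np.all(Q <= alpha, axis=1)  # True if the student has every skill the item requires
    return np.where(eta, 1.0 - slip, guess)

def classify(responses):
    """Posterior over all mastery profiles given observed 0/1 responses (uniform prior)."""
    profiles = [np.array(p) for p in product([0, 1], repeat=Q.shape[1])]
    likelihoods = []
    for alpha in profiles:
        p = dina_p_correct(alpha)
        likelihoods.append(np.prod(np.where(responses == 1, p, 1.0 - p)))
    post = np.array(likelihoods) / np.sum(likelihoods)
    return profiles, post

if __name__ == "__main__":
    responses = np.array([1, 0, 0, 1, 1])
    profiles, post = classify(responses)
    best = profiles[int(np.argmax(post))]
    print("most likely mastery profile:", best, "posterior:", round(float(post.max()), 3))
```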
While adaptive testing provides instructors with a powerful tool for assessing students, large-enrollment classes still present a challenge for providing in-the-moment instructional interventions at scale. By integrating adaptive learning processes into an adaptive testing platform, our work aims to present a more complete framework for optimizing student outcomes in large-enrollment STEM courses. This work in progress explores the next step in our project, which involves transitioning from CAT to adaptive learning. By leveraging the diagnostic insights from IRT and CDMs, we are developing an adaptive learning system that curates personalized learning pathways for each student. This system will select video-based content and instructional materials tailored to individual skill gaps according to each student's skill mastery profile and abilities. We aim for the outcome to be an engaging, time-efficient, and effective learning experience, with content tailored to each student's ability level and mastery profile. By integrating CAT with adaptive learning, we can create a continuous feedback loop in which assessment informs instruction in real time. This adaptability ensures that each student's learning path evolves according to their progress, leading to improved academic outcomes and a more personalized educational journey.
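As a sketch of how the assessment-to-instruction loop might be wired together (the actual LASSO pipeline and its content metadata are not shown here), the snippet below picks the least-mastered skill from a diagnostic profile and then chooses, from a hypothetical content library, the resource for that skill whose difficulty is closest to the student's current proficiency estimate. The resource titles, skill names, and difficulty values are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    title: str
    skill: str         # skill the video or activity targets
    difficulty: float  # on the same logit scale as the IRT proficiency estimate

# Hypothetical content library.
LIBRARY = [
    Resource("Momentum basics", "momentum", -1.0),
    Resource("Impulse worked examples", "momentum", 0.5),
    Resource("Energy bar charts", "energy_conservation", -0.5),
    Resource("Energy in multi-step problems", "energy_conservation", 1.0),
    Resource("Projectile motion intro", "2d_kinematics", -1.2),
    Resource("2D kinematics practice set", "2d_kinematics", 0.8),
]

def next_resource(mastery: dict[str, float], theta: float) -> Resource:
    """Pick content for the weakest skill, matched to the student's proficiency.

    mastery maps each skill to an estimated mastery probability from the CDM;
    theta is the IRT proficiency estimate from the adaptive test.
    """
    target_skill = min(mastery, key=mastery.get)                 # least-mastered skill
    candidates = [r for r in LIBRARY if r.skill == target_skill]
    return min(candidates, key=lambda r: abs(r.difficulty - theta))

if __name__ == "__main__":
    profile = {"momentum": 0.85, "energy_conservation": 0.35, "2d_kinematics": 0.60}
    print(next_resource(profile, theta=0.4))
```

In a continuous feedback loop, the mastery profile and proficiency estimate would be refreshed after each assessment cycle, so the selected resources change as the student's diagnostic profile evolves.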
Morphew, J., Mehrabi, A., & Van Dusen, B. (2025, June). BOARD # 243: From Adaptive Testing to Adaptive Learning: An NSF IUSE project. Paper presented at the 2025 ASEE Annual Conference & Exposition, Montreal, Quebec, Canada. 10.18260/1-2--55603
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2025 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015