Seattle, Washington
June 14, 2015
June 17, 2015
ISBN: 978-0-692-50180-1
ISSN: 2153-5965
Session: First-Year Programs
Page count: 11
Pages: 26.304.1 - 26.304.11
DOI: 10.18260/p.23643
Permanent URL: https://peer.asee.org/23643
Farshid Marbouti is currently pursuing his Ph.D. in Engineering Education at Purdue University. His research interest is first-year engineering, specifically the use of learning analytics to improve first-year engineering students' success. He completed his M.A. in Educational Technology and Learning Design at Simon Fraser University in Canada, and his B.S. and M.S. in computer engineering in Iran.
Heidi A. Diefes-Dux is a Professor in the School of Engineering Education at Purdue University. She received her B.S. and M.S. in Food Science from Cornell University and her Ph.D. in Food Process Engineering from the Department of Agricultural and Biological Engineering at Purdue University. She is a member of Purdue’s Teaching Academy. Since 1999, she has been a faculty member within the First-Year Engineering Program, teaching and guiding the design of one of the required first-year engineering courses that engages students in open-ended problem solving and design. Her research focuses on the development, implementation, and assessment of modeling and design activities with authentic engineering contexts. She is currently a member of the educational team for the Network for Computational Nanotechnology (NCN).
Dr. Johannes Strobel is Director, Educational Outreach Programs and Associate Professor, Engineering & Education at Texas A&M, College Station. He received his M.Ed. and Ph.D. in Information Science & Learning Technologies from the University of Missouri. His research/teaching focuses on engineering as an innovation in pre-K-12 education, STEM education policy, how to support teachers and students' academic achievements through engineering, engineering 'habits of mind' and empathy and care in engineering. He has published more than 140 journal articles and proceedings papers in engineering education and educational technology and is the inaugural editor of the Journal of Pre-College Engineering Education Research.
Building course-specific prediction models to identify at-risk students

The first step in helping students who may fail a course is to identify them as early in the semester as possible. Predicting success can help the course instructor identify at-risk students and support them in succeeding in the course. In an attempt to predict students' grades early in the semester, some instructors take the grading criteria from the course syllabus and apply them to the performance information available so far in order to calculate an early grade for each student. This method may only be useful near the end of the semester, when the majority of performance information is available; at the beginning of the semester it can be extremely inaccurate. Using such an inaccurate method at the beginning of the semester can result in wrong predictions and lead students to mistrust the predictions.

With the use of predictive modeling techniques, it is possible to better predict students' success in a course. A predictive model can be used as an early warning system that predicts students' success in courses and informs both the instructor and the students of their performance. Use of an early warning system in a course, along with guidelines on how to succeed in the course, can increase students' success.

One common problem with the early warning systems currently in use is that they typically employ a general model that cannot address the complexity of all courses. Over the past decade, more instructors have been adding more complex learning objectives and using active learning strategies instead of traditional extensive lecturing. The new pedagogies are usually implemented along with new assessment methods that do not fit the traditional homework-and-exam framework.
Thus, the assessed components can vary widely from one course to another, and using one model for different courses can significantly reduce the model's accuracy. In this study, we built three models to identify at-risk students in a non-traditional, large (~1700 students) first-year engineering course at three important times in the semester according to the academic calendar. The models were then optimized for identifying at-risk students. The models were able to identify 79% of at-risk students at week 2 (the last day to drop a course without it appearing on the student's record), 90% at week 4 (the last day to withdraw from a course with a grade of W), and 98% at week 9 (the last day to withdraw from a course with a grade of W or WF). This high accuracy illustrates the value of creating course-specific prediction models instead of generic ones.
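To make the idea concrete, the sketch below shows what a course-specific at-risk classifier might look like in principle. This is not the paper's actual model: the feature set, data values, hyperparameters, and the 0.5 risk threshold are all illustrative assumptions; the sketch simply fits a logistic regression to hypothetical early-semester scores and flags students whose predicted failure probability is high.

```python
# Illustrative sketch only -- NOT the model from the paper. A minimal
# logistic regression (trained by stochastic gradient descent) over
# hypothetical week-2 performance features, used to flag at-risk students.
import math

def train_logistic(X, y, lr=0.1, epochs=500):
    """Fit logistic regression weights/bias by stochastic gradient descent."""
    n_features = len(X[0])
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted P(at risk)
            err = p - yi                      # gradient of log loss w.r.t. z
            for j in range(n_features):
                w[j] -= lr * err * xi[j]
            b -= lr * err
    return w, b

def predict_at_risk(w, b, x, threshold=0.5):
    """Flag a student as at risk if predicted failure probability >= threshold."""
    z = b + sum(wj * xj for wj, xj in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z)) >= threshold

# Hypothetical early-semester data: [homework average, quiz average] on a
# 0-1 scale; label 1 = student ultimately failed (at risk), 0 = passed.
X = [[0.90, 0.85], [0.80, 0.90], [0.40, 0.30],
     [0.35, 0.50], [0.95, 0.70], [0.20, 0.25]]
y = [0, 0, 1, 1, 0, 1]

w, b = train_logistic(X, y)
flags = [predict_at_risk(w, b, xi) for xi in X]
```

In practice, a course-specific model of this kind would be retrained at each academic milestone (week 2, 4, 9) using only the assessment components graded by that point, which is what allows it to reflect a course's particular grading structure rather than a generic homework-and-exam template.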
Marbouti, F., Diefes-Dux, H. A., & Strobel, J. (2015, June). Building Course-Specific Regression-based Models to Identify At-risk Students. Paper presented at the 2015 ASEE Annual Conference & Exposition, Seattle, Washington. 10.18260/p.23643
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015