
BOARD #56A: Student Feedback Analysis Using Natural Language Processing (NLP) and Sentiment Analysis


Conference

2025 ASEE Annual Conference & Exposition

Location

Montreal, Quebec, Canada

Publication Date

June 22, 2025

Start Date

June 22, 2025

End Date

August 15, 2025

Conference Session

Civil Engineering Division (CIVIL) Poster Session

Tagged Division

Civil Engineering Division (CIVIL)

Page Count

16

DOI

10.18260/1-2--56002

Permanent URL

https://peer.asee.org/56002


Paper Authors


Sharmin Jahan Badhan, Independent Researcher


Sharmin Jahan Badhan is an independent researcher. She received an M.S. in Computer Science from United International University. In addition to her research interests in Artificial Intelligence, Deep Learning, and Natural Language Processing, she is actively engaged in exploring innovative applications of these technologies in construction site environments.



Reihaneh Samsami, University of New Haven


Reihaneh Samsami (Ph.D., P.E.) joined the University of New Haven (UNH) as a faculty member in Construction Management in Fall 2022. As program director, she contributed to the development of a new M.S. in Construction Management program. She has also been involved in Entrepreneurial Mindset Learning by KEEN and in Open Pedagogy at UNH.
In addition to her work in engineering education, she has more than four years of experience working with Departments of Transportation (DOTs) as a Graduate Research Assistant. Her research is positioned at the intersection of Automated Construction Inspection, Construction Information Modeling, and Data-Driven Decision-Making for project managers, contractors, inspectors, and other project stakeholders.



Goli Nossoni, University of New Haven


Dr. Goli Nossoni is currently an Associate Professor in the Department of Civil and Environmental Engineering at the University of New Haven. She received her M.S. and Ph.D. in Civil Engineering from Michigan State University. In addition to her interest in



Abstract

Academic institutions rely on collecting students’ feedback on their learning experience as one of the indirect measures of learning assessment. This feedback, collected through end-of-term and sometimes mid-term surveys, gives students an opportunity to share their viewpoints on various aspects of their learning, including teaching style, teaching effectiveness, course objectives, course material, and course evaluation. Instructors use this feedback to improve their teaching and accommodate students’ needs, and institutions use it as an indicator of teaching effectiveness during annual review and tenure evaluation. Course surveys typically contain both quantitative (ratings) and qualitative (free-text) sections that evaluate the learning process, teaching effectiveness, and the instructor’s inclusivity.

Course evaluations have been studied by many researchers from different perspectives, but most studies have focused on statistical analysis of the quantitative responses by different student groups and/or on identifying the questions that elicit shorter and more informative free-text responses from students. The qualitative sections are often used only to evaluate the instructor and are not analyzed in any detail because of the complexity of the analytics that would be required. A tool that could statistically quantify student responses to the qualitative sections of course evaluations would therefore benefit both instructors and institutions. Educational opinion mining is an approach developed over recent decades to encode students’ feedback using tools such as qualitative text analysis.

The objective of this research is to utilize these tools to design a methodology for studying student comments and their polarity (positive/negative/neutral) and for determining whether the comments agree with students’ responses to the quantitative sections. For example, graduate and higher-level courses typically receive better responses to the quantitative sections of course evaluations than lower-level courses. A rubric was developed to categorize student comments as positive, negative, or neutral and was used to analyze course evaluations from different engineering student populations at the University of XXX. The results are compared with those reported in the literature for responses from the quantitative sections of course evaluations.
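To illustrate the kind of polarity classification described above, the sketch below implements a minimal lexicon-based classifier in pure Python. This is an assumption-laden toy, not the paper's rubric or NLP pipeline: the word lists, the negation handling, and the scoring rule are all illustrative stand-ins chosen for clarity.

```python
# Minimal lexicon-based polarity classifier for student comments.
# NOTE: the word lists and scoring rule are illustrative placeholders,
# not the rubric used in the paper.

POSITIVE = {"clear", "helpful", "engaging", "excellent", "organized", "great"}
NEGATIVE = {"confusing", "boring", "unclear", "disorganized", "poor"}
NEGATORS = {"not", "never", "hardly"}  # flip the polarity of the next word

def polarity(comment: str) -> str:
    """Classify a comment as 'positive', 'negative', or 'neutral'."""
    words = comment.lower().replace(".", " ").replace(",", " ").split()
    score = 0
    for i, word in enumerate(words):
        negated = i > 0 and words[i - 1] in NEGATORS
        if word in POSITIVE:
            score += -1 if negated else 1
        elif word in NEGATIVE:
            score += 1 if negated else -1
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

comments = [
    "The lectures were clear and the examples were helpful.",
    "The assignments were confusing and the feedback was poor.",
    "The course met on Tuesdays.",
]
for c in comments:
    print(f"{polarity(c):8s} | {c}")
```

In practice a rubric-based or model-based approach (as in the paper) would replace the hand-built lexicon, but the output shape is the same: one label per comment, which can then be tabulated against the quantitative ratings.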

Badhan, S. J., Samsami, R., & Nossoni, G. (2025, June). BOARD #56A: Student Feedback Analysis Using Natural Language Processing (NLP) and Sentiment Analysis. Paper presented at the 2025 ASEE Annual Conference & Exposition, Montreal, Quebec, Canada. 10.18260/1-2--56002

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2025 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015