Applying Natural Language Processing Techniques to an Assessment of Student Conceptual Understanding



2016 ASEE Annual Conference & Exposition


New Orleans, Louisiana

Publication Date

June 26, 2016

Start Date

June 26, 2016

End Date

August 28, 2016





Conference Session

Works in Progress: Assessment and Research Tools

Tagged Division

Educational Research and Methods


Paper Authors


Christian Anderson Arbogast Oregon State University


Christian Arbogast is a graduate student in the School of Mechanical, Industrial, and Manufacturing Engineering at Oregon State University. His academic and research interests include adapting computer science techniques to supplement traditional qualitative analysis and the mechanical design process.



Devlin Montfort Oregon State University


Dr. Montfort is an Assistant Professor in the School of Chemical, Biological and Environmental Engineering at Oregon State University.




Applying Natural Language Processing Techniques to an Assessment of Student Conceptual Understanding

This work in progress describes an approach to enriching a qualitative evaluation of students’ conceptual understanding by integrating assessment tools developed in the context of Computational Linguistics. Evaluating students’ conceptual understanding remains a significant challenge in engineering education and has led to the adoption of a variety of survey assessment tools, such as concept inventories. These existing tools typically compare conceptual understanding against an established (though often tacit) standard of expertise. They can help practicing educators categorize student learning outcomes, especially in aggregate, but they tend to fall short when describing the intricacies of an individual student’s conceptual understanding of a subject. The personal attention of an experienced qualitative researcher remains the unrivaled standard for a comprehensive assessment of an individual, but it comes at an often prohibitive expense.

The goal of this research is to automatically generate intermediary evidence of student understanding in order to supplement the non-prescriptive process of qualitatively analyzing student learning. Recent advances in the field of Natural Language Processing have greatly increased the practicality of using computer software to extract meaning from human language. These tools are especially good at identifying linguistic patterns in the way a person structures their communication, which allows us to programmatically parse and analyze student interview transcripts to identify the frequency and variation of linguistic artifacts. These artifacts can illuminate how an interviewee’s use of language changes when describing concepts of which they have a deep understanding versus those with which they have little expertise. By performing a statistical analysis of those linguistic properties, a collection of at-a-glance data can be generated to aid a qualitative researcher’s assessment.
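As an illustration of the kind of programmatic parsing described above, the sketch below computes a few simple per-response indicators (response length, lexical diversity, hedge-word rate) from transcript text. The feature set, hedge list, and sample responses are invented for illustration; the paper does not specify which linguistic artifacts it tracks, and a production pipeline would likely use a full NLP toolkit rather than a regex tokenizer.

```python
import re
from collections import Counter

# Hypothetical hedge/filler markers. This list is purely illustrative;
# the actual artifacts studied in the paper are not specified here.
HEDGES = {"maybe", "like", "sort", "kind", "probably", "guess", "think"}

def tokenize(text):
    """Lowercase word tokenizer (stdlib only, for the sake of the sketch)."""
    return re.findall(r"[a-z']+", text.lower())

def linguistic_features(response):
    """Per-response indicators: length, lexical diversity, hedge-word rate."""
    tokens = tokenize(response)
    counts = Counter(tokens)
    n = len(tokens)
    return {
        "tokens": n,
        # type-token ratio: distinct words / total words (lexical diversity)
        "type_token_ratio": len(counts) / n if n else 0.0,
        # fraction of words that are hedges or fillers
        "hedge_rate": sum(counts[h] for h in HEDGES) / n if n else 0.0,
    }

# Invented interview responses: one confident, one hesitant.
transcript = [
    "The beam deflects because the load creates a bending moment.",
    "Um, I guess it maybe sort of bends because, like, the force pushes it, I think.",
]
for resp in transcript:
    print(linguistic_features(resp))
```

Aggregating such features across an interview would produce the kind of at-a-glance data the abstract describes, without replacing the researcher's own reading of the transcript.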

This work is largely grounded in the aspect of Cognitive Load Theory that connects elements of learner expertise and the transferability of knowledge structures to measurable performance ratings and physiological effects. Notable examples include EEG monitoring, eye tracking, and recording linguistic effects, all of which demonstrate significant interdependence. Variations in those performance and physiological effects have been shown to reflect changes in mental effort and working memory usage, which can serve as partial evidence of a student’s facility with a concept. Developing a methodology that focuses on examining linguistic effects holds great practical promise because of its very low dissemination and implementation costs. This investigation blends software-generated statistical indicators of student understanding with a more traditional process of qualitative analysis. Under this approach, the researcher’s role of discovering what is being communicated is aided by the natural language software’s extraction of how a student communicates it.
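One way such a statistical indicator could be operationalized, assuming per-response feature values like those above, is a standardized mean difference (Cohen's d) between responses about familiar and unfamiliar concepts. The feature values below are invented for illustration and do not come from the study.

```python
import statistics

def cohens_d(group_a, group_b):
    """Standardized mean difference between two sets of feature values,
    using the pooled sample standard deviation (a simple effect size)."""
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    pooled_var = (
        (len(group_a) - 1) * var_a + (len(group_b) - 1) * var_b
    ) / (len(group_a) + len(group_b) - 2)
    return (mean_a - mean_b) / pooled_var ** 0.5

# Invented hedge-word rates: responses about a familiar concept vs. an
# unfamiliar one. A large positive d would flag the contrast for the
# researcher's attention.
familiar = [0.02, 0.01, 0.03, 0.02]
unfamiliar = [0.10, 0.12, 0.08, 0.11]
print(round(cohens_d(unfamiliar, familiar), 2))
```

An indicator like this only flags where language use shifts; interpreting why it shifts remains the qualitative researcher's task.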

As with any new application of cross-disciplinary techniques, assessing the validity of this approach will be foundational. We currently see a correlation between some linguistic artifacts and student conceptual understanding, and would like to present our preliminary findings to the research community. Further development could yield greater repeatability and inter-rater reliability in qualitative data analysis, increased confidence in findings, and a reduction in the time needed for a beginning researcher to gain proficiency. Ultimately, quick and reliable assessments of student conceptual understanding have the potential to dramatically change engineering education by encouraging effective formative feedback, supporting the assessment of pedagogies and curricular materials, and increasing the degree to which academic assessments reflect student knowledge and abilities.

Arbogast, C. A., & Montfort, D. (2016, June). Applying Natural Language Processing Techniques to an Assessment of Student Conceptual Understanding. Paper presented at 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. https://doi.org/10.18260/p.26262

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2016 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015