
Frequency Analysis of Terminology on Engineering Examinations



2011 ASEE Annual Conference & Exposition


Vancouver, BC

Publication Date

June 26, 2011

Start Date

June 26, 2011

End Date

June 29, 2011



Conference Session

Educational Research and Methods Potpourri II

Tagged Division

Educational Research and Methods

Page Numbers

22.727.1 - 22.727.14




Paper Authors


Chirag Variawa University of Toronto


Chirag Variawa is a Ph.D. candidate in the Department of Mechanical and Industrial Engineering at the University of Toronto. His research interests include maximizing inclusivity, accessibility and usability of engineering education via universal instructional design and innovative instructional methods. He is an active Canadian member of the SCC division of ASEE, co-chair of the Leaders of Tomorrow (Graduate) program and teaching assistant in the Faculty of Applied Science and Engineering. He received his B.A.Sc. (2009) from the Department of Materials Science and Engineering, University of Toronto.



Susan McCahan University of Toronto


Susan McCahan is a Professor in the Department of Mechanical and Industrial Engineering at the University of Toronto. In addition, she is currently the Chair of First Year for the Faculty of Applied Science and Engineering. She received her B.Sc. from Cornell University (1985), and M.S. (1989) and Ph.D (1992) degrees from Rensselaer Polytechnic Institute in Mechanical Engineering.




There have always been differences between instructor expectations of what students “should know” and the actual background experience that students have entering an engineering program. The divergence between this assumed knowledge and the actual knowledge base may be increasing as the student population diversifies. Previous work has noted the impact of diversity in this regard. The issue is not just wide differences in preparation in basic math, science, or communication ability, but diversity in the cultural background of students. While we frequently laud diversity, we have not always followed this up by supporting inclusivity in our classrooms and finding ways to bridge cultural differences that may exist. Specifically, when we contextualize technical material to situate an engineering problem in a real-world scenario, students are subject to a test of their background experience; so, instead of clarifying a technical concept, the context may make the concept more inaccessible. This may also compromise the inclusivity of the learning environment, causing students to doubt their suitability for studying engineering.

Current engineering students bring with them a wealth of knowledge that is as diverse as the backgrounds and cultures they represent. However, students may be unfairly disadvantaged during examinations, for example, if they are expected to understand terminology that assumes a specific set of a priori experience; instead of assessing whether the student understands the technical material, assessments may inadvertently test vocabulary. This is especially important for examinations because the closely supervised setting prohibits assistance. Yet, pedagogically we would prefer to assess understanding of concepts in authentic situations, not in the abstract. And a number of effective methods, such as model-eliciting activities (MEAs), are based on authentic contextualization.

This represents an instance where learner characteristics are misaligned with the expectations of the learning environment, and there has been little research in this particular area of engineering education. The goal of the current study is to evaluate the frequency of this type of misalignment. As raw data we are using an exam bank that contains final examinations collected over a number of years for all engineering courses at a large engineering school. A frequency analysis of the words and terms used on the exams has been carried out, excluding course-specific technical terminology. At this point in the study we are assuming that infrequently used words and terms are typically less familiar to students. This is an assumption that will be tested in a subsequent phase of the study.

The results of the frequency analysis are analyzed with respect to:

1. The types of words and terms that are used on exams.
2. The types of courses where contextualization appears to be used most often.
3. The types of contexts that appear most frequently.

The results will be discussed within the theoretical framework of learner characteristics and interaction with the learning environment. In particular, we will examine these results with reference to the literature on Universal Instructional Design (UID), and current work on learner diversity.
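To make the method concrete, the core of a word-frequency analysis with an exclusion list can be sketched in a few lines of Python. This is an illustrative sketch only, not the authors' implementation: the function name, the tokenization rule, and the tiny example corpus are all assumptions made for demonstration, and a real study would work over full exam papers with a curated list of course-specific technical terms.

```python
from collections import Counter
import re

def term_frequencies(exam_texts, excluded_terms):
    """Count word frequencies across a collection of exam texts,
    skipping a caller-supplied set of excluded terms (e.g. course-specific
    technical vocabulary). Returns a Counter keyed by lowercase word."""
    counts = Counter()
    excluded = {t.lower() for t in excluded_terms}
    for text in exam_texts:
        # Lowercase and tokenize on letters/apostrophes so counting
        # is case-insensitive and punctuation is ignored.
        for word in re.findall(r"[a-z']+", text.lower()):
            if word not in excluded:
                counts[word] += 1
    return counts

# Hypothetical mini-corpus standing in for scanned exam questions.
exams = [
    "A conveyor belt at a bottling plant moves crates of soda.",
    "The conveyor belt accelerates; find the force on each crate.",
]
freq = term_frequencies(exams, excluded_terms={"force"})
print(freq.most_common(3))
```

Low-frequency entries in the resulting counter would then be candidates for the "potentially unfamiliar terminology" the study flags, pending the later validation phase described above.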

Variawa, C., & McCahan, S. (2011, June). Frequency Analysis of Terminology on Engineering Examinations. Paper presented at 2011 ASEE Annual Conference & Exposition, Vancouver, BC. 10.18260/1-2--18008

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2011 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015