Consistency in Assessment of Pre-Engineering Skills

Conference

2014 ASEE Annual Conference & Exposition

Location

Indianapolis, Indiana

Publication Date

June 15, 2014

Start Date

June 15, 2014

End Date

June 18, 2014

ISSN

2153-5965

Conference Session

FPD 1: The Path to Engineering

Tagged Division

First-Year Programs

Page Count

14

Page Numbers

24.315.1 - 24.315.14

Permanent URL

https://peer.asee.org/20206

Paper Authors

Shelley Lorimer, Grant MacEwan University

Dr. Shelley Lorimer, P.Eng. is Chair of the Bachelor of Science in Engineering Transfer Program (BSEN) at Grant MacEwan University in Edmonton, Alberta. She teaches undergraduate courses in statics and dynamics, as well as courses in engineering professionalism. She is currently participating in a research project with Alberta Innovates – Technology Futures in the oil sands and hydrocarbon recovery group doing reservoir simulation of enhanced oil recovery processes. She has a Ph.D. in numerical modeling from the University of Alberta, also in Edmonton.

Jeffrey A. Davis, Grant MacEwan University

With degrees in both civil and mechanical engineering, Jeff went on to obtain a Ph.D. from the Institute of Energy Technology at ETH Zurich in 2004. His past research includes the dispersion of pollutants in rivers and numerical modeling of turbulent and multiphase flows. Currently, Jeff is a first-year engineering instructor at MacEwan University. With a passion for teaching, his research focus has turned to understanding and automating student assessment techniques, as well as to the socio-economic sustainability of educational institutions.


Abstract

Assessment tools are often used in a predictive way to gauge the overall skills of first-year engineering students as they begin their engineering education. They are also useful in setting interventions in terms of tutorials, as well as in providing self-improvement motivation for students whose scores are not consistent with their earlier high school performance. Previous research has demonstrated that the academic averages obtained in high school may not necessarily reflect the skill level (competency) of students entering first year, especially in mathematics. However, a longitudinal study over more than ten years has also indicated that the averages from the math advisory and engineering assessment (Force Concept Inventory) exams did not show a statistically significant decline during that time period. In this study, both the math and engineering assessment results were further analyzed on a per-question basis to determine whether there were any observable trends in the student responses.

The results for the math assessment exams, taken over thirteen years, indicated that the average performance on each question is statistically very consistent from year to year. The questions that the majority of students got right each year, and those that the majority got wrong each year, showed very little variation in the standard deviation (typically < 5%), which was used as the measure of variability of the mean. The results were further analyzed by categorizing the questions according to three classifications: algebra, trigonometry, and geometry. Typically, the questions with the best overall performance were simple algebra questions, and the questions with the worst overall performance involved trigonometric concepts. Moreover, as the complexity of the algebra questions increased, the success rate on those questions diminished, as expected. Both assessment exams were time limited, and students were not allowed to use calculators. In the high school curriculum in our region, students use calculators regularly in their high school math courses. As a result, their inherent competency in trigonometric functions is lacking, as the average scores on these questions (typically less than 30%) indicate.

Engineering assessment (Force Concept Inventory) exam results collected over a slightly shorter duration (six years) were also analyzed. The same trends in student responses were observed, although in this case the results were somewhat less striking than those obtained from the math assessment. It is clear, however, that there is consistency in the success rate for individual exam questions that test both math and engineering concepts. These results support the anecdotal contention that students collectively have competency in certain areas (algebra) but lack competency in others (trigonometry). They further demonstrate that students often come into first-year engineering with common misconceptions and common math deficiencies.

The results from this study are useful from several perspectives. First, they can provide a focus for interventions that address both competency and misconceptions. Second, the consistency and repeatability of these data may provide an impetus to work with K-12 educators to address these issues before students reach university. Finally, the consistency of these data implies that pre-engineering skills are somewhat predictable from year to year.
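The per-question consistency analysis described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' actual code: the question labels, years, and success rates are invented, and the 5% standard-deviation threshold is the only figure taken from the abstract.

```python
from statistics import mean, stdev

# Hypothetical data: fraction of students answering each question
# correctly, per exam year (all values invented for illustration).
scores_by_year = {
    2001: {"Q1": 0.91, "Q2": 0.52, "Q3": 0.27},
    2002: {"Q1": 0.89, "Q2": 0.55, "Q3": 0.24},
    2003: {"Q1": 0.93, "Q2": 0.50, "Q3": 0.29},
}

def per_question_stats(scores_by_year):
    """Mean and standard deviation of each question's yearly success rate."""
    questions = sorted(next(iter(scores_by_year.values())))
    stats = {}
    for q in questions:
        yearly = [year_scores[q] for year_scores in scores_by_year.values()]
        stats[q] = {"mean": mean(yearly), "stdev": stdev(yearly)}
    return stats

stats = per_question_stats(scores_by_year)

# A question counts as "consistent" if its year-to-year standard
# deviation stays below 5 percentage points, the measure of
# variability of the mean mentioned in the abstract.
consistent = [q for q, s in stats.items() if s["stdev"] < 0.05]
```

Grouping the resulting per-question means by category (algebra, trigonometry, geometry) would then reproduce the kind of comparison the paper reports, e.g. low means clustering on the trigonometry questions.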

Lorimer, S., & Davis, J. A. (2014, June), Consistency in Assessment of Pre-Engineering Skills. Paper presented at the 2014 ASEE Annual Conference & Exposition, Indianapolis, Indiana. https://peer.asee.org/20206

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2014 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015