
Cognitive Skills Development Among Undergraduate Engineering Students



2020 ASEE Virtual Annual Conference Content Access


Virtual Online

Publication Date

June 22, 2020

Start Date

June 22, 2020

End Date

June 26, 2020

Conference Session

Cognitive Skills Development

Tagged Division

Educational Research and Methods

Paper Authors


Hannah Smith, Queen's University


Hannah Smith is an educational researcher, supporting projects in cognitive skills assessment and professional skills development in engineering. Hannah completed a Master's degree in Engineering Education, investigating engineering students' creative confidence and internal motivation for creativity.



Brian M. Frank, Queen's University


Brian Frank is the DuPont Canada Chair in Engineering Education Research and Development, and the Associate Dean (Teaching and Learning) in the Faculty of Engineering and Applied Science at Queen's University where he works on engineering curriculum development, program assessment, and developing educational technology. He is also a Professor in Electrical and Computer Engineering.




Overview: This research paper presents results from a study of engineering students' cognitive skill development, conducted as part of a large-scale study in Canada. It investigates observable differences between incoming and graduating engineering students' literacy and numeracy in comparison with institution- and nation-wide samples.

Higher education institutions face pressure to improve graduates' abilities in the cognitive and behavioural skills necessary for professional success [1], [2], a pressure echoed in engineering by requirements from accrediting bodies [3], [4]. Fundamental skills in numeracy and literacy are core to problem solving and critical thinking, but consistent measurement of these cognitive skills is challenging. The Essential Adult Skills Initiative (EASI) was a large-scale research project involving 20 Canadian post-secondary institutions, designed to measure the literacy, numeracy, and problem-solving skills of incoming and graduating college and university students using an internationally benchmarked test, the Education and Skills Online Assessment (ESO) [5].

The ESO was administered in a cross-sectional approach to 1040 first-year and 1107 fourth-year university students in Canada, including 47 first-year and 65 fourth-year engineering students at the institution in question. Participation was voluntary, encouraged by a completion grade in first year and a $20 gift card incentive in fourth year. Time spent on the test was found to be a significant predictor of final score, reinforcing the importance of student motivation and adequate time on task in obtaining reliable results in a low-stakes testing situation [6].
To combat this, data were filtered to eliminate students who completed the literacy and numeracy test components in less than one third of the recommended time [7]. Results suggest that undergraduate engineering students make significant gains from first to fourth year in numeracy, but not in literacy. Fourth-year students in the engineering sample performed slightly better in literacy and numeracy than the full institutional sample; however, 38% of fourth-year engineering participants scored at a level suggesting they would have trouble consistently analysing complex texts to evaluate information and make decisions. The results suggest that most graduating students present average skillsets, with too few instances of the superior baseline skills desired for the engineering profession. These results are based on a small sample and should be treated with caution, but they align with similar trends from other tests of cognitive skills among the same population [8], [9].

Two recommendations arise from this work. First, there are concerns about the baseline skill level of some graduating engineering undergraduates. Second, low-stakes standardized tests are subject to issues of student motivation, with performance correlating significantly with both time spent and self-reported effort [6], suggesting value in developing domain-specific authentic assessment [10].

References

[1] National Association of Colleges and Employers, "Career Readiness Competencies: Employer Survey Results," 2014. [Online]. [Accessed: 07-Aug-2019].
[2] H. J. Passow and C. H. Passow, "What competencies should undergraduate engineering programs emphasize? A systematic review," J. Eng. Educ., vol. 106, no. 3, pp. 475–526, Jul. 2017.
[3] International Engineering Alliance, "Celebrating international engineering education standards and recognition," Washington, 2014.
[4] L. J. Shuman, M. Besterfield-Sacre, and J. McGourty, "The ABET 'professional skills' - Can they be taught? Can they be assessed?," J. Eng. Educ., vol. 94, no. 1, pp. 41–55, 2005.
[5] H. P. Weingarten, S. Brumwell, K. Chatoor, and L. Hudak, "Measuring Essential Skills of Postsecondary Students: Final Report of the Essential Adult Skills Initiative," Higher Education Quality Council of Ontario, Toronto, Ontario, 2018.
[6] N. Simper, B. Frank, J. Kaupp, N. Mulligan, and J. Scott, "Comparison of standardized assessment methods: logistics, costs, incentives and use of data," Assess. Eval. High. Educ., vol. 44, no. 6, pp. 821–834, Aug. 2019.
[7] Y. Attali, "Effort in low-stakes assessments: What does it take to perform as well as in a high-stakes setting?," Educ. Psychol. Meas., vol. 76, no. 6, pp. 1045–1058, 2016.
[8] N. Simper, B. Frank, J. Scott, and J. Kaupp, "Learning outcomes assessment and program improvement at Queen's University," Toronto, Ontario, 2018.
[9] B. M. Frank, N. Simper, and J. A. Kaupp, "How we know they're learning: Comparing approaches to longitudinal assessment of transferable learning outcomes," in ASEE Annual Conference & Exposition, 2016.
[10] J. D. Hathcoat, J. D. Penn, L. L. B. Barnes, and J. C. Comer, "A Second Dystopia in Education: Validity Issues in Authentic Assessment Practices," Res. High. Educ., vol. 57, no. 7, pp. 892–912, Nov. 2016.

Smith, H., & Frank, B. M. (2020, June), Cognitive Skills Development Among Undergraduate Engineering Students. Paper presented at 2020 ASEE Virtual Annual Conference Content Access, Virtual Online. 10.18260/1-2--34297

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2020 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015