
Measuring Student Computational Thinking in Engineering and Mathematics: Development and Validation of a Non-programming Assessment



2020 ASEE Virtual Annual Conference Content Access


Virtual Online

Publication Date

June 22, 2020

Start Date

June 22, 2020

End Date

June 26, 2020

Conference Session

K-12 and Bridge Experiences in Engineering Education

Tagged Division

Educational Research and Methods


Paper Authors


Timothy Ryan Duckett, University of Toledo


T. Ryan Duckett is a research associate with Acumen Research and Evaluation, LLC, a program evaluation and grant-writing company that specializes in STEM and early childhood education. He is a PhD student in the Research and Measurement department at the University of Toledo.



Gale A. Mentzer, Acumen Research and Evaluation, LLC


Gale A. Mentzer, PhD, owner and director of Acumen Research and Evaluation, LLC, has been a professional program evaluator since 1998. She holds a PhD in Educational Research and Measurement from The University of Toledo and a Master of Arts in English Literature and Language, a unique combination of specializations that melds quantitative and qualitative methodologies. She has extensive experience evaluating projects focused on STEM education, including several multi-million-dollar federally funded projects. Previously she taught graduate-level courses in Statistics, Testing and Grading, Research Design, and Program Evaluation for the College of Education at The University of Toledo.




With the emphasis on the development of computational thinking (CT) skills comes the challenge of accurately measuring CT. Because of its close association with computer science, CT is often measured using programming tools (such as Scratch, Zoombinis, gaming, or simulation-based situations) on a computer (Shute, Sun, & Asbell-Clarke, 2017). CT skills, however, go well beyond programming and should be measurable as a skill that one can apply in other problem-solving situations (Berland & Wilensky, 2015). The majority of CT measures that do not use technology and programming as the medium of measurement are project-specific, examine attitudes toward CT, take a longitudinal approach by examining a project-based process (Shute, Sun, & Asbell-Clarke, 2017), or do not examine the transfer of CT to situations other than computer programming (Bers et al., 2014).

This presentation shares the development and validation of a student CT test that can be completed as an online or paper-and-pencil survey. While developed as part of an NSF STEM+C project designed to improve mathematics and CT ability and interest through learning how to program self-driving model cars, the CT assessment was created as a generic, mathematics-based test of CT because it was administered to both intervention and control students in high school mathematics courses. An additional goal was to create an assessment that could be completed in less than 30 minutes yet provide a valid measure of student CT.

Our assessment is based upon the following framework, which in turn is based upon Wing's seminal article (2006): CT requires students to take a complex problem and break it down into a series of smaller, more manageable problems (decomposition). The smaller problems can be examined individually, considering how similar problems have been solved previously (pattern recognition) and focusing only on the important details while ignoring irrelevant information (abstraction). Next, simple steps or rules to solve each smaller problem can be designed (algorithms).

The assessment has a total of 15 items. The first eight are multiple-choice items asking students about their preferred problem-solving processes. The remaining seven are open-ended and ask students to elaborate on the steps they would take to solve problems such as finding the fastest route from a bus stop to the library (a road map is included) and finding the area of an irregular polygon (students are asked to list the steps or process, not solve the problems). This presentation examines the reliability and validity of the test and explores whether there is a change in CT skills from pre- to post-participation. It also explores differences not only between the intervention and control groups but also across student demographics such as age, gender, race, and education level.
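As a concrete illustration (not part of the assessment instrument, which asks students to describe steps rather than write code), the decomposition, abstraction, and algorithm-design sequence targeted by the open-ended items can be sketched for the irregular-polygon problem. The sketch below assumes the shape is described by vertex coordinates and uses the shoelace formula; the function name and example shape are hypothetical.

```python
# Hypothetical sketch of the CT steps applied to the irregular-polygon item.
# Decomposition: reduce "find the area" to summing per-edge contributions.
# Abstraction: represent the shape only by its ordered (x, y) vertices,
# ignoring irrelevant details such as color or orientation.
# Algorithm: apply the shoelace formula over consecutive vertex pairs.

def polygon_area(vertices):
    """Area of a simple polygon whose vertices are listed in order."""
    n = len(vertices)
    total = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap back to the first vertex
        total += x1 * y2 - x2 * y1
    return abs(total) / 2.0

# An L-shaped "irregular polygon": a 4x4 square with a 2x2 corner removed.
l_shape = [(0, 0), (4, 0), (4, 2), (2, 2), (2, 4), (0, 4)]
print(polygon_area(l_shape))  # 12.0
```

Each comment maps one CT construct from the framework onto a step of the solution, which is the kind of step-listing the open-ended items are designed to elicit in prose.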

Berland, M., & Wilensky, U. (2015). Journal of Science Education and Technology, 24, 628.

Bers, M., Flannery, L., Kazakoff, E., & Sullivan, A. (2014). Computational thinking and tinkering: Exploration of an early childhood robotics curriculum. Computers & Education, 72, 145–157.

Shute, V. J., Sun, C., & Asbell-Clarke, J. (2017). Demystifying computational thinking. Educational Research Review, 22, 142–158.

Duckett, T. R., & Mentzer, G. A. (2020, June). Measuring Student Computational Thinking in Engineering and Mathematics: Development and Validation of a Non-programming Assessment. Paper presented at 2020 ASEE Virtual Annual Conference Content Access, Virtual Online. 10.18260/1-2--34963

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2020 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015