Board 196: A Framework to Assess Debugging Skills for Computational Thinking in Science and Engineering

Conference

2023 ASEE Annual Conference & Exposition

Location

Baltimore, Maryland

Publication Date

June 25, 2023

Start Date

June 25, 2023

End Date

June 28, 2023

Conference Session

NSF Grantees Poster Session

Tagged Topic

NSF Grantees Poster Session

Page Count

9

DOI

10.18260/1-2--42591

Permanent URL

https://peer.asee.org/42591

Download Count

154

Paper Authors

Derrick Hylton, Spelman College

Shannon Hsianghan-huang Sung, Institute for Future Intelligence

Shannon H. Sung is a Learning Scientist at the Institute for Future Intelligence. Her research focuses on technology-enhanced learning and assessment, interdisciplinary STEM learning, and cognitive learning processes.

Xiaotong Ding

Mary Johanna Van Vleet

Abstract

A rubric is presented to assess the debugging skills of students, particularly in the natural sciences and engineering. The three categories of cognitive processes assessed in debugging are identification, isolation, and iteration. These are defined, and the characteristics of each process are listed. We discuss the method used to develop this rubric, which was based on intentional errors in a programming assignment given to students in an introductory physics course. The assignment was programmed in Python and in a visual-based programming platform called iFlow. We believe that visual-based programming helps elicit weaknesses in debugging because it removes students' reliance on familiarity with a particular programming language.
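
As an illustration of the kind of intentionally seeded error described above, the short Python sketch below shows a hypothetical introductory-physics exercise with a planted bug. It is not taken from the paper's actual assignment; the physics scenario, variable names, and the specific error are assumptions made for illustration only.

import numpy as np

g = 9.81                    # gravitational acceleration, m/s^2
v0 = 20.0                   # launch speed, m/s
theta = np.radians(45.0)    # launch angle

t = np.linspace(0.0, 3.0, 100)
x = v0 * np.cos(theta) * t
# Seeded error: the 1/2 factor in the kinematic equation is omitted,
# so the computed trajectory disagrees with the analytic prediction.
y = v0 * np.sin(theta) * t - g * t**2   # should be 0.5 * g * t**2

print(f"maximum height (buggy): {y.max():.2f} m")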

Our focus on debugging skills came from a survey in which students in an introductory physics course that included engineering majors self-identified barriers to computational work. Debugging was the primary self-identified barrier, along with abstraction skills, which will be the focus of another work. We also present the results of this survey. The Python assignment (n_text = 9) was used to create the rubric, and the iFlow assignment (n_graphic = 11) was used to test it. Scoring was based on a scale of six levels in each category. Although the sample size was too small to establish rigorous scoring reliability, we discuss how the two researchers reached agreement in scoring the assignments after iterative modifications of the rubric and rescoring. For the Python assignment, the average score for identification was 2.75/5, for isolation 2.30/5, and for iteration 3.33/5. For the iFlow assignment, the average for identification was 2.63/5, for isolation 2.23/5, and for iteration 3.32/5. A consistent trend across these assignments showed that students' approach to debugging is mainly to identify and iterate without a full understanding of the error (i.e., isolation). The lack of a full understanding of the error implies that students are prone to repeat it. Thus, the important outcome of debugging is to understand the source of the error by systematically investigating different parts of the computational solution. Our preliminary results led to the hypothesis that weak debugging skills are mainly due to weaknesses in the isolation process. This hypothesis will be tested in a future experiment. Results from such an experiment will be significant to those designing intervention strategies to integrate computational thinking into science and engineering curricula.
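
For concreteness, the sketch below shows one way per-category averages on the 0-5 rubric scale could be computed from scored submissions. It is a minimal illustration with made-up scores, not the authors' analysis code or data.

from statistics import mean

# Hypothetical rubric scores (0-5) for a few submissions, keyed by category.
scores = [
    {"identification": 3, "isolation": 2, "iteration": 4},
    {"identification": 2, "isolation": 3, "iteration": 3},
    {"identification": 4, "isolation": 2, "iteration": 3},
]

for category in ("identification", "isolation", "iteration"):
    average = mean(s[category] for s in scores)
    print(f"{category}: {average:.2f}/5")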

Hylton, D., & Sung, S. H., & Ding, X., & Van Vleet, M. J. (2023, June), Board 196: A Framework to Assess Debugging Skills for Computational Thinking in Science and Engineering. Paper presented at 2023 ASEE Annual Conference & Exposition, Baltimore, Maryland. 10.18260/1-2--42591

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2023 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.