Conference Location: Tampa, Florida
Publication Date: June 15, 2019
Conference Start Date: June 15, 2019
Conference End Date: June 19, 2019
Division: Computers in Education
Page Count: 15
DOI: 10.18260/1-2--33083
Permanent URL: https://peer.asee.org/33083
Download Count: 895
Nabeel Alzahrani is a Computer Science Ph.D. student in the Department of Computer Science and Engineering at the University of California, Riverside. His research interests include the causes of student struggle and debugging methodologies in introductory computer programming courses.
Frank Vahid is a Professor of Computer Science and Engineering at the University of California, Riverside. His research interests include embedded systems design and engineering education. He is a co-founder of zyBooks.com.
Alex Edgcomb is a Senior Software Engineer at zyBooks, a startup spun off from UC Riverside and acquired by Wiley. zyBooks develops interactive, web-native learning materials for STEM courses. Alex actively studies and publishes on the efficacy of web-native learning materials for student outcomes.
Previous research reports common student errors in introductory programming (CS1) classes. Knowing the common errors enables instructors to improve teaching and content so students learn to avoid those errors, and enables automated help, such as hints triggered by particular auto-detected errors. Finding and fixing some errors is part of learning, so our focus is specifically on errors that cause struggle, meaning excessive time or attempts; struggle may lead to giving up, loss of confidence, or cheating. For 89 online auto-graded coding homework problems in our CS1 class of 100 students (mostly engineering/science majors), we first automatically determined the 12 problems with the highest struggle rates. We then spent about 100 hours manually examining incorrect student submissions to determine which errors caused struggle and how much time was spent on each error. Like previous work, we found many common general errors, like using = rather than ==. However, we also found problem-specific errors, like misusing a particular library function, leading to a first conclusion that a help system should allow teachers/authors to add problem-specific hints. Furthermore, we analyzed the errors that caused the longest struggle and found some uncommon "one-off" errors, leading to a second conclusion that a help system won't be able to detect all errors and thus might need automated recommending or alerting for human assistance (or other techniques).
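To make the cited general error concrete, below is a minimal C++ sketch of the = versus == mistake. The language and variable names are assumptions for illustration; the paper does not include this code.

#include <iostream>
using namespace std;

int main() {
   int userAge = 17;

   // Common CS1 error: '=' assigns rather than compares.
   // 'userAge = 18' stores 18 into userAge, and the expression
   // evaluates to 18 (nonzero, i.e., true), so this branch always
   // runs regardless of the original age.
   if (userAge = 18) {
      cout << "Always prints; userAge is now 18" << endl;
   }

   userAge = 17;  // Reset for the corrected version.

   // Corrected: '==' compares without modifying userAge.
   if (userAge == 18) {
      cout << "Prints only when userAge equals 18" << endl;
   }

   return 0;
}

The paper's two conclusions also suggest a simple help-system structure: teacher/author-added, problem-specific hints matched against a submission's auto-detected errors, with a fallback that alerts a human when no known error matches. The following hypothetical sketch illustrates that structure; it is not the authors' implementation, and all names and patterns are illustrative.

#include <iostream>
#include <regex>
#include <string>
#include <vector>
using namespace std;

// Hypothetical: a hint pairs a pattern that detects a suspected
// error in a student submission with a message shown as a hint.
struct Hint {
   regex errorPattern;
   string message;
};

int main() {
   // General hints apply to every problem; problem-specific hints
   // are added by the teacher/author for one problem (conclusion 1).
   vector<Hint> hints = {
      { regex(R"(if\s*\(\s*\w+\s*=\s*[^=])"),
        "Did you mean '==' (comparison) instead of '=' (assignment)?" },
      { regex(R"(sqrt\s*\(\s*-)"),
        "Problem-specific: sqrt() needs a non-negative argument here." }
   };

   string submission = "if (userAge = 18) { cout << \"Adult\"; }";

   bool matched = false;
   for (const Hint& h : hints) {
      if (regex_search(submission, h.errorPattern)) {
         cout << "Hint: " << h.message << endl;
         matched = true;
      }
   }

   // Conclusion 2: uncommon "one-off" errors match no pattern, so
   // the system instead recommends or alerts for human assistance.
   if (!matched) {
      cout << "No automated hint; flagging for instructor review." << endl;
   }
   return 0;
}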
Alzahrani, N., Vahid, F., & Edgcomb, A. D. (2019, June). Manual Analysis of Homework Coding Errors for Improved Teaching and Help. Paper presented at the 2019 ASEE Annual Conference & Exposition, Tampa, Florida. 10.18260/1-2--33083
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2019 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.