Wentworth Institute of Technology, Massachusetts
April 22, 2022
April 23, 2022
10.18260/1-2--42197
https://peer.asee.org/42197
Dr. Marisha Rawlins is an Assistant Professor in the Electrical and Computer Engineering Program at Wentworth Institute of Technology (WIT). Her research interests include computer architecture optimizations, embedded systems and devices used in teaching and healthcare, and methods and systems for improving teaching and learning. Dr. Rawlins received her PhD in Electrical and Computer Engineering from The University of Florida. Prior to working at WIT, she was an Assistant Professor in Computer Engineering, and the Discipline Coordinator for the BASc in Computer Engineering and the MSc in Information and Communication Technology Programmes, at The University of Trinidad and Tobago.
This paper investigates different methodologies for implementing competency-based grading (CBG) in programming courses. In CBG, students focus on mastering individual topics and must demonstrate their level of competence in each topic or sub-topic by the end of the semester. For example, in one of the courses used in this study, the topic User-defined Functions is graded on the scale 0 (No Submission), 1 (Beginner), 2 (Competent), and 3 (Proficient). A key characteristic of CBG is flexibility; to provide it, these courses combined a flipped classroom with unlimited resubmissions, giving students the freedom to move through the entire course at their own pace and to learn from their feedback. One disadvantage of this approach was the grading burden on the instructor: with a large number of submissions to grade, it can be difficult to return feedback to students quickly. This study began with the initial implementation of competency-based grading in two junior-level programming courses in the Electrical Engineering and Computer Engineering programs in 2020; this initial version included unlimited resubmissions and the flipped classroom. The study then tested two refinements to the initial CBG approach: (1) only one resubmission per assessment and (2) unlimited resubmissions with a live-coding lab demonstration. We compared performance, measured by overall grades and the number of students receiving a grade of D, F, or Withdraw, across the initial implementation and the two refinements. When only one resubmission per assessment was allowed, we observed no overall change in performance, indicating that moving from unlimited resubmissions to a single resubmission does not affect performance while still allowing some flexibility and an opportunity for students to master the topic after receiving feedback. The instructor's burden is also reduced because there are fewer submissions to grade. When the live-coding lab demonstration was added, instructors were able to grade the lab exercise quickly and give feedback directly to students in real time, decreasing the time taken to grade the lab exercises. Students also performed better with a live-coding demo or exam, since they knew they would have to answer questions about their code in real time. In future work we plan to combine live-coding demonstrations with limited resubmissions.
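The abstract describes the per-topic 0-3 competency scale but not how those levels roll up into an overall course grade. The short Python sketch below is one hypothetical way such an aggregation could work; the level names come from the abstract, while the averaging rule, letter-grade thresholds, and topic names other than User-defined Functions are illustrative assumptions, not details taken from the paper.

# Illustrative sketch only: the aggregation rule and thresholds below are
# assumptions, not the grading scheme reported in the paper.

# Competency scale described in the abstract.
LEVELS = {0: "No Submission", 1: "Beginner", 2: "Competent", 3: "Proficient"}

def overall_grade(topic_scores: dict[str, int]) -> str:
    """Map per-topic competency levels (0-3) to a letter grade.

    The thresholds here are hypothetical examples.
    """
    avg = sum(topic_scores.values()) / len(topic_scores)
    if avg >= 2.7:
        return "A"
    if avg >= 2.3:
        return "B"
    if avg >= 2.0:
        return "C"
    if avg >= 1.5:
        return "D"
    return "F"

# Example: Proficient in functions, Competent in one topic, Beginner in another.
scores = {"User-defined Functions": 3, "Arrays": 2, "Pointers": 1}
print(overall_grade(scores))  # average 2.0 -> "C" under these assumed thresholds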
Rawlins, M., & Junsangsri, P. (2022, April), Refining Competency-Based Grading in Undergraduate Programming Courses Paper presented at ASEE-NE 2022, Wentworth Institute of Technology, Massachusetts. 10.18260/1-2--42197
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2022 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015