
Misunderstandings, Mistakes, and Dishonesty: A Post-hoc Analysis of a Large-scale Plagiarism Case in a First-year Computer Programming Course


Conference

2020 ASEE Virtual Annual Conference Content Access

Location

Virtual Online

Publication Date

June 22, 2020

Start Date

June 22, 2020

End Date

June 26, 2020

Conference Session

First-year Programs: Focus on Students

Tagged Division

First-Year Programs

Page Count

17

DOI

10.18260/1-2--34977

Permanent URL

https://peer.asee.org/34977

Download Count

451


Paper Authors


Philip Reid Brown, Rutgers, The State University of New Jersey


Philip Brown is an Assistant Teaching Professor in Undergraduate Education at Rutgers School of Engineering. Philip recently received his PhD from the Department of Engineering Education at Virginia Tech. His research interests include the use of motivation, cognition, and learning theories in engineering education research and practice, and better understanding student perspectives in engineering programs.


Ilene J. Rosen, Rutgers, The State University of New Jersey


Ilene Rosen has been an educational administrator serving students in higher education for 35 years. She earned her doctoral degree in educational psychology from Rutgers University Graduate School of Education. Currently the Associate Dean for Student Services at Rutgers School of Engineering, she has also served as the director of several programs, including the NJ Educational Opportunity Fund Program at Rutgers School of Engineering, the NJ Governor’s School of Engineering & Technology, and the Northern NJ Junior Sciences Symposium. Rosen has been recognized as the Educator of the Year in Higher Education by the Society of Hispanic Professional Engineers.


Abstract

In this evidence-based practice paper, we discuss the issue of plagiarism in a first-year engineering computer programming course. Plagiarism can plague any course where students submit independently created work. Traditionally, plagiarism has been associated with writing assignments, and a wide variety of tools and interventions are available for both identifying and preventing plagiarism on these assignments. However, although computer programming courses also report a large number of plagiarism cases, fewer easy-to-use or well-understood tools and interventions are available to instructors of these courses. This paper describes a sequence of plagiarism cases in a large first-year computer programming course for engineers, and how the course was adapted to address the prevalence of these cases. One issue particular to computer programming is a lack of consensus on when it is ethical to copy and use computer code without acknowledgement. Many programmers openly share code, and being able to find examples of code that help one write a program can be a valuable and valid skill for a programmer. However, when courses are tasked with teaching and assessing the basic principles of computer programming, there is a dissonance between the free-sharing, open culture often found in programming communities and the needs of instructors who must determine that students understand those basic principles. Additionally, we often encourage students to work in groups in computer programming courses, which can lead to confusion about the limits of plagiarism when submitting individual work. Some computer programming courses avoid plagiarism by relying on tests for assessment. However, the knowledge displayed in test answers is a less authentic representation of computer programming skill than projects that ask students to write and test real computer programs.
In updating a first-year computer programming course at a large, public, land-grant, research institution in the Mid-Atlantic United States, one of the co-authors used the above reasoning to justify the inclusion of several individual projects in the course’s curriculum. This required course serves approximately 1100 engineering students yearly; the programming language used is MATLAB. The course is presented in a lecture-recitation format. Lectures of 100-200 students focus on introducing concepts and having students do informal activities with those new concepts. Recitations of 30-40 students focus on having students work on hands-on programming activities, individually and in groups, while interacting with a team of graduate and undergraduate assistants there to facilitate active learning. In assessing the individual projects in this course, and through the use of the Stanford Moss plagiarism tool, the co-author of this paper found evidence that a large proportion of students were sharing or copying code on individual project submissions worth approximately 5% of course credit each. In the following semester, efforts were made to clarify what constituted plagiarism on these assignments. However, rates of suspected plagiarism flagged by Moss remained high. In processing the plagiarism cases for these projects, the authors identified a number of themes explaining why students were flagged. Some students were still confused by the distinction between sharing ideas in a group and submitting the same code as other students. Some were (incorrectly) adamant that there was only one way, or a very limited number of ways, in which a computer program could be written to complete a desired task. Some admitted that they had not read the instructions on acceptable collaboration carefully. Some admitted to plagiarism because of stress, lack of time, or other factors.
Finally, some cases were dismissed upon review. The first intervention to prevent plagiarism, clarifying and repeating instructions, had little to no effect on the number of cases in subsequent semesters. The second, ongoing intervention was developed to circumvent plagiarism while still assessing the authentic programming abilities displayed in projects: students are assessed by using and modifying their own project code in a timed, computer-based assessment. While this solution has effectively ended the need to process plagiarism cases, the underlying problems driving plagiarism are still present. Scores on similar project prompts fell drastically between semesters in which project code was graded and semesters in which students took the computer-based assessment. Some students did poorly on the computer-based assessments because they lacked understanding of skills often assumed to be rudimentary, such as running computer programs that they (or others) had written, or making simple edits to code. This hints at a potentially deeper problem with our ability to teach and assess essential programming knowledge in large classes, and may be an underlying reason for the previously mentioned plagiarism cases.
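The Moss tool mentioned above detects copying by fingerprinting normalized token streams rather than comparing raw text, so whitespace changes and deleted comments do not disguise shared code. As a rough illustration only (this is not the Moss implementation, which uses a more robust winnowing scheme; the function names and sample snippets here are hypothetical), the following Python sketch hashes k-grams of normalized tokens from two MATLAB-style submissions and reports their overlap:

```python
# Hypothetical sketch of fingerprint-based code similarity, in the spirit of
# tools like Moss. Not the Moss algorithm itself.
import re

def fingerprints(code, k=5):
    # Normalize: strip MATLAB-style comments, lowercase, then tokenize into
    # identifiers and single punctuation characters.
    code = re.sub(r"%.*", "", code)
    tokens = re.findall(r"[A-Za-z_]\w*|\S", code.lower())
    # Hash every k-gram of consecutive tokens into a set of fingerprints.
    return {hash(tuple(tokens[i:i + k])) for i in range(len(tokens) - k + 1)}

def similarity(a, b, k=5):
    fa, fb = fingerprints(a, k), fingerprints(b, k)
    if not fa or not fb:
        return 0.0
    # Jaccard similarity: shared fingerprints over all fingerprints.
    return len(fa & fb) / len(fa | fb)

# A submission and a copy disguised with spacing and comment changes.
original = "x = 0; % init counter\nfor i = 1:10\n    x = x + i;\nend"
copied   = "x=0;\nfor i=1:10\nx=x+i;\nend   % totals"
print(round(similarity(original, copied), 2))  # → 1.0
```

Note that this naive version is defeated by renaming variables; production tools additionally map all identifiers to a common token before fingerprinting, which is one reason a high Moss score still requires the human review described above.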

Brown, P. R., & Rosen, I. J. (2020, June), Misunderstandings, Mistakes, and Dishonesty: A Post-hoc Analysis of a Large-scale Plagiarism Case in a First-year Computer Programming Course. Paper presented at 2020 ASEE Virtual Annual Conference Content Access, Virtual Online. 10.18260/1-2--34977

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2020 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015