Tampa, Florida
June 15, 2019
October 19, 2019
NSF Grantees Poster Session
10.18260/1-2--32348
https://peer.asee.org/32348
Stephen H. Edwards is a Professor and the Associate Department Head for Undergraduate Studies in the Department of Computer Science at Virginia Tech, where he has been teaching since 1996. He received his B.S. in electrical engineering from Caltech, and M.S. and Ph.D. degrees in computer and information science from The Ohio State University. His research
interests include computer science education, software testing, software engineering, and programming languages. He is the project lead for Web-CAT, the most widely used open-source automated grading system in the world. Web-CAT is known for allowing instructors to grade students based on how well they test their own code. In addition, his research group has produced a number of other open-source tools used in classrooms at many other institutions. Currently, he is researching innovative approaches for giving feedback to students as they work on assignments, to provide a more welcoming experience that recognizes the effort they put in and the accomplishments they make as they work on solutions, rather than simply looking at whether the student has finished what is required. The goals of his research are to strengthen growth mindset beliefs while encouraging deliberate practice, self-checking, and skill improvement as students work.
I have been a Ph.D. student in the Department of Computer Science at Virginia Tech since Fall 2013. My research interest is computer science education. Before that, I worked as research staff in the School of Medicine at the University of Virginia from 2007 to 2013. I hold a Master's degree in Computer Science from Virginia Tech, as well as Master's degrees in Computer Science and Chemistry from Georgia State University in Atlanta, GA. I obtained my Bachelor's degree in Engineering from East China University of Science and Technology in Shanghai, China.
When one first learns to program, feedback on early assignments can easily induce a fixed mindset---the belief that programming is a fixed ability one either has or lacks. Holding a fixed mindset has negative consequences for learning. The alternative is to foster a growth mindset, the belief that ability can be improved through practice, effort, and hard work. However, automated grading tools used on programming assignments currently focus on objectively assessing functional correctness and other performance-oriented features of student programs. This encourages students to adopt performance-oriented goals, which are characteristic of a fixed mindset. By building on existing measures of "productive effort", we design a new kind of feedback approach that focuses on recognizing, encouraging, and rewarding diligence and productive actions based on those indicators. The goal is to add such elements to existing feedback in an emotionally supportive way that recognizes and values the effort a student expends. The feedback design presented here consists of two main components. The first is textual/verbal feedback that recognizes the productive effort students spend on a problem, or that encourages students to be strategic about expending effort to improve their own skills. The point of this feedback is to convey to the student that constructive practice to improve one's skills is valued and recognized, independently of the final product they are creating. The second is boosters: rewards in the form of perks that enhance parts of the student work experience. Taking inspiration from video game psychology and other sources, we designed a booster-based reward system that recognizes hard work without tacitly promoting performance-oriented (score-oriented) motivation.
In addition to describing the design of the reward and recognition feedback strategy and the variable ratio reinforcement schedule on which it is based, we also present a post hoc analysis of the results obtained when applying this strategy to existing student submission data. This lets us investigate what feedback or boosters individual students would have earned in a real-life situation, validating the feedback design before live deployment.
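A variable ratio schedule delivers a reward after an unpredictable number of qualifying actions, which is known to sustain engagement better than fixed schedules. As an illustration only---the paper's actual ratio, parameters, and definition of a "productive action" are not given here---a minimal sketch of such a schedule for awarding boosters might look like this:

```python
import random

class VariableRatioRewarder:
    """Awards a booster after a varying number of productive actions.

    Illustrative sketch: the mean ratio and what counts as a
    'productive action' are assumptions, not the paper's parameters.
    """

    def __init__(self, mean_ratio=4, seed=None):
        self.mean_ratio = mean_ratio
        self.rng = random.Random(seed)
        self.count = 0
        self._set_next_threshold()

    def _set_next_threshold(self):
        # Draw the next required count uniformly around the mean, so
        # the payoff point stays unpredictable to the student.
        self.threshold = self.rng.randint(2, 2 * self.mean_ratio - 2)

    def record_productive_action(self):
        """Returns True when the current action earns a booster."""
        self.count += 1
        if self.count >= self.threshold:
            self.count = 0
            self._set_next_threshold()
            return True
        return False

rewarder = VariableRatioRewarder(mean_ratio=4, seed=42)
earned = [i for i in range(1, 21) if rewarder.record_productive_action()]
print(earned)  # indices of booster-earning actions, irregularly spaced
```

Because the next threshold is redrawn after every reward, students cannot predict exactly which action will pay off, only that sustained productive effort will be rewarded.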
Edwards, S. H., & Li, Z. (2019, June), Board 43: Designing Boosters and Recognition to Promote a Growth Mind-set in Programming Activities Paper presented at 2019 ASEE Annual Conference & Exposition , Tampa, Florida. 10.18260/1-2--32348
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2019 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015