New Orleans, Louisiana
June 26, 2016
August 28, 2016
This work in progress describes our use of Cody Coursework to provide 24/7 automatic feedback for MATLAB programming practice in homework and labs, as well as assessment in proctored quizzes given during lab periods, in a first-year engineering class of approximately 1000 students currently run in approximately 35 lab sections. One of the most basic principles of instruction is that students get better with practice, and the beneficial effects of practice are enhanced by timely feedback. For computer programming, scarce grading and feedback resources can be augmented by automatic testing tools that provide basic feedback on whether student programs meet expectations for correct output behavior. Cody Coursework provides a self-service web interface to a cloud-based service that informs students of the results of instructor-provided tests on the programs they submit. Instructors can obtain summary and detailed reports of class results over the web and as CSV-format downloads. The instructor interface offers simple means of posing problems and of checking correctness and performance for labs, assignments, and quizzes. Over the past year, as part of normal course development, we have developed dozens of Cody exercises, deployed in a conventional course with face-to-face lecture and lab time, a textbook, homework, and exams.
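For illustration, an auto-graded exercise of this kind pairs a problem statement with instructor-written assertion checks that run against each student submission; a failed assertion is reported back to the student as feedback. The function name and checks below are a hypothetical sketch of the pattern, not drawn from our course materials:

```matlab
% Hypothetical exercise: students submit times2.m, which should
% double its input (scalar or vector).

% A reference solution a student might submit:
function y = times2(x)
  y = 2 * x;   % the correct output behavior the checks verify
end

% Instructor-authored checks, run automatically on each submission:
assert(isequal(times2(5), 10))
assert(isequal(times2(-3), -6))
assert(isequal(times2([1 2 3]), [2 4 6]))   % elementwise on vectors
```

Because the checks exercise only observable input-output behavior, students are free to implement the function however they like while still receiving immediate pass/fail feedback on each case.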
Our evaluation of Cody Coursework as a teaching and learning tool is an ongoing activity spanning two phases, two years, and two types of evaluation, with the first type applied iteratively as needed. The first phase and type of evaluation is formative: as we continue to implement tools, materials, and exercises, we regard the students as both consultants and co-evaluators of the tools. Gathering student feedback involves two surveys and the usage analytics provided by Cody Coursework itself. We constructed and administered a first survey with the goal of understanding the levels of experience and types of knowledge students bring to the course; approximately 350 students completed it. We will perform more detailed analysis to clarify how many students bring which types of experience to their evaluations and what their expectations are for using computers in the future, both immediate and long-range. These evaluations are also expected to help faculty calibrate their ongoing choices of topics and exercise difficulty to better fit the course goals and audience. A second formative questionnaire focuses on student feedback about the effectiveness and usability of Cody Coursework. In the subsequent second phase, summative evaluation will be conducted through an IRB-reviewed investigation of the effects of Cody Coursework and autograded exercises on student learning.
Char, B. W., Daulagala, I., Kandasamy, N., & Hewett, T. T. (2016, June). Using Automatic MATLAB Program Testing for a First-Year Engineering Computation Course. Paper presented at 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. doi: 10.18260/p.27131
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2016 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.