
Using Automatic MATLAB Program Testing for a First-Year Engineering Computation Course



2016 ASEE Annual Conference & Exposition


New Orleans, Louisiana

Publication Date

June 26, 2016

Start Date

June 26, 2016

End Date

August 28, 2016





Conference Session

First-Year Programs Division Technical Session 5B: Work-In-Progress: 5 Minute Postcard Session II

Tagged Division

First-Year Programs


Paper Authors


Bruce W. Char Drexel University


Bruce Char is Professor of Computer Science in the Department of Computing of the College of Computing and Informatics at Drexel University. He is interested in the use of automatic feedback systems in engineering and computer science education.



Isuru Daulagala Drexel University


Isuru Daulagala received his BS in Electrical and Computer Engineering from Drexel University, Philadelphia, PA in 2014. He is currently a graduate student in the Electrical and Computer Engineering Department at Drexel University. His research interests include physical design, high-performance computing, and developing and improving tools for engineering education. From September 2014 to March 2015, he was an intern at NVIDIA Corporation in Santa Clara, CA.



Nagarajan Kandasamy Drexel University


Naga Kandasamy is an Associate Professor in the Electrical and Computer Engineering Department at Drexel University, where he teaches and conducts research in the area of computer engineering, with specific interests in embedded systems, self-managing systems, reliable and fault-tolerant computing, distributed systems, computer architecture, and testing and verification of digital systems. He received his Ph.D. in 2003 from the University of Michigan. Prior to joining Drexel, he was a research scientist at the Institute for Software Integrated Systems, Vanderbilt University, from 2003 to 2004.

Prof. Kandasamy is a recipient of the 2007 National Science Foundation Faculty Early Career Development (CAREER) Award and of best student paper awards at the IEEE International Conference on Autonomic Computing in 2006 and 2008 and the IEEE Pacific Rim Dependability Conference in 2012. He is a senior member of the IEEE.



Thomas T. Hewett Drexel University


Tom Hewett is Professor Emeritus of Psychology and of Computer Science at Drexel University. His teaching included courses on Cognitive Psychology, Problem Solving and Creativity, the Psychology of Human-Computer Interaction, and the Psychology of Interaction Design. In addition, he has taught one-day professional development courses at both national and international conferences and has participated in post-academic training for software engineers. Tom has worked on the design and development of several software projects and several pieces of commercial courseware. Some of his research papers have focused on the evaluation of interactive computing systems and the impact of evaluation on design. Other research papers have explored some of the pedagogical and institutional implications of universal student access to personal computers. In addition, he has given invited plenary addresses at international conferences. Tom chaired the ACM SIGCHI Curriculum Development Group, which proposed the first nationally recognized curriculum for the study of Human-Computer Interaction. His conference organizing work includes serving as Co-Chair of the CHI '94 Conference on Human Factors in Computing Systems and Program Chair for the 2013 Creativity and Cognition Conference. In 2014 he was the recipient of the ACM SIGCHI Lifetime Service Award.




This work in progress describes our use of Cody Coursework to provide 24/7 automatic feedback on MATLAB programming practice in homework and labs, as well as assessment in proctored quizzes given during lab periods, in a class of approximately 1,000 first-year engineering students currently run in approximately 35 lab sections. One of the most basic principles of instruction is that students get better with practice, and the beneficial effects of practice are enhanced by timely feedback. For computer programming, scarce grading and feedback resources can be augmented by automatic testing tools that provide basic feedback on whether student programs meet expectations for correct output behavior. Cody Coursework provides a self-service web interface to a cloud-based service that informs students of the results of instructor-provided tests run against their submitted programs. Instructors can obtain summary and detailed reports of class results over the web and as CSV downloads. The instructor interface provides a simple means of posing problems and of checking correctness and performance for labs, assignments, and quizzes. Over the past year, as part of normal course development, we have created dozens of Cody exercises, deployed in a conventional course with face-to-face lecture and lab time, a textbook, homework, and exams.
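Cody Coursework's instructor tests are written in MATLAB, and the paper does not reproduce them; as a purely illustrative sketch of the general autograding pattern described above (instructor-supplied test cases compared against a reference solution, with per-case pass/fail feedback), the following uses Python with hypothetical function names. It is not Cody Coursework's actual API.

```python
# Illustrative sketch only (hypothetical names, not Cody Coursework's API):
# an instructor-style harness that runs a student's function on several test
# inputs, compares each result with a reference solution, and reports
# per-case feedback -- the kind of basic output-correctness check the
# abstract describes.

def reference_mean(values):
    """Instructor's reference solution: arithmetic mean of a list."""
    return sum(values) / len(values)

def student_mean(values):
    """Stand-in for a student submission under test."""
    return sum(values) / len(values)

def run_checks(student_fn, reference_fn, cases, tol=1e-9):
    """Run each test case; return (inputs, verdict) pairs as feedback."""
    results = []
    for inputs in cases:
        expected = reference_fn(inputs)
        try:
            actual = student_fn(inputs)
        except Exception as exc:  # a crashing submission still gets feedback
            results.append((inputs, f"error: {exc}"))
            continue
        if abs(actual - expected) <= tol:
            results.append((inputs, "pass"))
        else:
            results.append((inputs, f"fail (got {actual}, expected {expected})"))
    return results

feedback = run_checks(student_mean, reference_mean, [[1, 2, 3], [10.0, 20.0]])
for case, verdict in feedback:
    print(case, verdict)
```

The tolerance comparison rather than exact equality mirrors common practice when grading numeric MATLAB output, where floating-point round-off can differ between a student's formulation and the reference.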

Our evaluation of Cody Coursework as a teaching-learning tool is an ongoing activity involving two phases, two years, and two types of evaluation, with the first type applied iteratively as needed. The first phase and type of evaluation is formative. As we continue to implement tools, materials, exercises, etc., during this phase we regard the students as both consultants and co-evaluators of the tools. Student feedback is gathered through two surveys and through the usage analytics provided by Cody Coursework itself. We have constructed and administered a first survey with the goal of better understanding the levels of experience and types of knowledge students bring to the course; approximately 350 students completed it. We will perform more detailed analysis to clarify which types of experience students bring to their evaluations and what their expectations are for using computers in the future, both immediate and long range. These evaluations are also expected to assist faculty in calibrating their ongoing choices of topics and exercise difficulty to better fit the course goals and audience. A second formative questionnaire focuses on student feedback about the effectiveness and usability of Cody Coursework. In the subsequent second phase, summative evaluation will be conducted through an IRB-reviewed investigation of the effects of Cody Coursework and autograded exercises on student learning.

Char, B. W., & Daulagala, I., & Kandasamy, N., & Hewett, T. T. (2016, June), Using Automatic MATLAB Program Testing for a First-Year Engineering Computation Course. Paper presented at 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. 10.18260/p.27131

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2016 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015