June 23, 2013
June 26, 2013
Computing & Information Technology
23.549.1 - 23.549.14
Evaluating an e-Content System for Data Structures and Algorithms Courses

We seek to fundamentally improve instruction in Data Structures and Algorithms (DSA) courses, which play a central role in Computer Science curricula. Students often find this material difficult to comprehend because so much of the content concerns dynamic processes, such as the behavior of algorithms and their effects over time on data structures. One difficulty that students encounter is a lack of feedback on whether they understand the material. A typical DSA course offers a relatively small number of homework and test problems, whose results come only long after the student gives an answer.

This study evaluates OpenDSA, an e-content system for presenting materials related to DSA. OpenDSA combines textbook-quality content with algorithm visualizations for every algorithm presented, along with a rich collection of interactive activities and exercises. The exercises provide immediate feedback to students through automated assessment, and give the instructor feedback on student progress through the material. We hypothesize that answering many questions and exercises and receiving immediate feedback on performance will allow students (and their instructors) to know whether they are on track with their learning. The study (to be conducted during October 2012) compares the impact of the self-paced tutorials on learning data structures and algorithms in a quasi-experimental setting using a control group and a treatment group. The control group will receive three weeks of standard lecture and textbook instruction, similar to what has typically been done in this course. The treatment section will spend its class time working through equivalent content in the form of online material. A test administered after these three weeks will compare the results of the tutorial and in-class interactive condition with the standard lecture condition. We will compare the mean test grades for the two groups. We will also administer an opinion survey to examine students' perceptions and opinions about these tutorials as opposed to a traditional lecture environment.

Pedagogical assessment of the deployment of new courseware is crucial to its success. The classical approach to course design and delivery relegates assessment of teaching and learning objectives to the end. We will instead use a parallel model of development, integrating assessment design into the iterative phases of designing the content, tutorials, and proficiency exercises. We envision this study as only the first major round of formative evaluation for the system. The study provides a systematic approach to integrating course objectives with teaching strategies, pedagogies, and best practices. The results will allow us to fine-tune the materials, for example ensuring that they take the right amount of time to complete and adequately cover the material. The results will also guide fundamental decisions about how to structure a course around a full semester's worth of online materials.
Hall, S., Shaffer, C. A., Fouh, E., ElShehaly, M. H., & Breakiron, D. (2013, June). Evaluating Online Tutorials for Data Structures and Algorithms Courses. Paper presented at the 2013 ASEE Annual Conference & Exposition, Atlanta, Georgia. https://peer.asee.org/19563
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2013 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015