Atlanta, Georgia
June 23–26, 2013
ISSN: 2153-5965
Computing & Information Technology
14 pages, 23.549.1 - 23.549.14
DOI: 10.18260/1-2--19563
https://peer.asee.org/19563
Dr. Simin Hall is a Research Assistant Professor in the Department of Mechanical Engineering (ME) at Virginia Tech (VT). She is currently collaborating with Dr. Cliff Shaffer in the Computer Science Department on a National Science Foundation funded TUES project to improve instruction in Data Structures and Algorithms (DSA) courses. Her applied research in education concerns cognitive functioning using online learning technologies. She has redesigned two undergraduate Thermodynamics courses for online/distance delivery in the ME Department at VT. In 2010, with an education grant from the Nuclear Regulatory Commission (NRC), she completed the online design of the graduate nuclear engineering certificate program. In 2011, a new NRC education grant supported the design of two new graduate nuclear courses for the Master's program. She maintains research and publishing tracks in nascent interdisciplinary trust concepts, eLearning, and innovative teaching and learning in statistics and research methods, engineering, medical fields, and assessment methods.
Dr. Shaffer received his PhD in Computer Science from the University of Maryland, College Park in 1986. He is currently a Professor of Computer Science at Virginia Tech, where he has been since 1987. He directs the AlgoViz and OpenDSA projects, whose goals respectively are to support the use of algorithm visualization in the classroom and to develop a complete online collection of interactive tutorials for data structures and algorithms courses. His research interests are in Computational Biology and Bioinformatics, Problem Solving Environments, Digital Education, Algorithm Visualization, Hierarchical Data Structures, Algorithm Design and Analysis, and Data Structures.
Evaluating an e-Content System for Data Structures and Algorithms Courses

We seek to fundamentally improve instruction in Data Structures and Algorithms (DSA) courses, which play a central role in Computer Science curricula. Students often find this material difficult to comprehend because so much of the content is about dynamic processes, such as the behavior of algorithms and their effects over time on data structures. One difficulty that students encounter is a lack of feedback on whether they understand the material. A typical DSA course offers a relatively small number of homework and test problems, whose results come only long after the student gives an answer.

This study evaluates OpenDSA, an e-content system for presenting materials related to DSA. OpenDSA combines textbook-quality content with algorithm visualizations for every algorithm presented, along with a rich collection of interactive activities and exercises. The exercises provide immediate feedback to students through automated assessment, and give the instructor feedback on student progress through the material. We hypothesize that answering many questions and exercises and receiving immediate feedback on performance will allow students (and their instructors) to know whether they are on track with their learning. The study (to be conducted during October 2012) compares the impact of the self-paced tutorials on learning data structures and algorithms in a quasi-experimental setting using a control group and a treatment group. The control group will receive standard lectures and textbook readings for three weeks, as has typically been done in this course. The treatment section will spend class time working through equivalent content in the form of online materials. A test administered after these three weeks will compare the results of the tutorials and interactive in-class sessions with the standard lecture condition. We will compare the mean test grades for the two groups.
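A minimal sketch of how such a two-group comparison of mean test grades might be computed, using Welch's t statistic for independent samples with unequal variances. The scores below are entirely hypothetical placeholder data, not results from the study:

```python
# Hypothetical illustration: comparing mean post-test scores for a
# control (lecture) section and a treatment (online tutorial) section.
# All score values are made up for demonstration purposes only.
from statistics import mean, variance
from math import sqrt

control = [72, 68, 75, 80, 65, 70, 74]    # lecture/textbook section (hypothetical)
treatment = [78, 82, 74, 85, 79, 81, 76]  # online tutorial section (hypothetical)

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    va, vb = variance(a), variance(b)  # sample variances
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

diff = mean(treatment) - mean(control)
t = welch_t(treatment, control)
print(f"mean difference = {diff:.2f}, t = {t:.2f}")
```

In practice a significance test of this kind would also report degrees of freedom and a p-value (e.g., via `scipy.stats.ttest_ind` with `equal_var=False`); the sketch above shows only the core computation behind comparing the two group means.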
We will also administer an opinion survey to examine students' perceptions and opinions about these tutorials as opposed to a traditional lecture environment.

Pedagogical assessment of new courseware deployed in courses is crucial to its success. The classical approach to course design and delivery relegates assessment of teaching/learning objectives to the end. We will instead use a parallel model of development, integrating the assessment design into the iterative phases of designing the content, tutorials, and proficiency exercises. We envision this study as only the first major round of formative evaluation for the system. The study provides a systematic approach to integrating course objectives with teaching strategies, pedagogies, and best practices. The results will allow us to conduct fine-grained tuning of the materials, such as ensuring that they take an appropriate amount of time to complete and adequately cover the material. The results will also guide fundamental decisions about how to structure a course around a full semester's worth of online materials.
Hall, S., Shaffer, C. A., Fouh, E., ElShehaly, M. H., & Breakiron, D. (2013, June). Evaluating Online Tutorials for Data Structures and Algorithms Courses. Paper presented at the 2013 ASEE Annual Conference & Exposition, Atlanta, Georgia. 10.18260/1-2--19563
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2013 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015