Salt Lake City, Utah
June 23, 2018
July 27, 2018
Summative testing not only holds students accountable for their learning progress but also intrinsically improves learning, a phenomenon termed “the testing effect.” In addition to traditional homework and tests, many course designs include formative assessments, which are intended primarily to provide practice with authentic skills or knowledge and to generate feedback that students and instructors use to determine next steps in the learning process. Recently we demonstrated, in a physiology course for sophomore-level biomedical engineering students, that a blended learning environment with well-designed formative assessments can perform as well as a traditional classroom with summative assessments in eliciting a testing effect. This result led us to ask whether the blended learning environment improves other aspects of the classroom, such as learner satisfaction or the quality of student-faculty interactions.

We compared end-of-course survey results from two sections of the same physiology course. One section (Control, Testing Effect, n = 86 respondents) was taught using traditional lectures and was assessed using weekly quizzes and quarterly tests; students received the higher of the average quiz or test grade. The second section (Intervention, Blended Learning, n = 39 respondents) was taught using a variety of active learning and low-stakes practice activities in addition to quarterly tests; students received a grade computed from a weighted average of these activities. We previously demonstrated that learning gains were not significantly different between the two sections. The end-of-course survey contained both Likert-type and open-ended questions addressing the contributions of class activities to the learning environment.
Student responses were significantly more positive (p < 0.05, t-test; medium effect sizes, Hedges’ g) regarding the overall learning environment and in-class lectures in the traditional section. In contrast, student responses were significantly more positive in the blended learning section regarding the textbook (medium effect size) and tests (small to medium effect size). The helpfulness of in-class activities, helpfulness of weekly quizzes, encouragement to explore outside resources, ability to relate the course to broader impacts, and quality of student-faculty interactions did not differ significantly between the two sections. Students were also asked to identify specific activities and examples that informed their scores on the Likert-type questions; analysis of these data is nearly complete. Overall, students in the traditional section appeared to be more satisfied with their learning experience in the course. Students in the blended learning section responded more positively not only to in-class learning activities but also to the summative assessments, suggesting that these students became more cognizant of the role of testing in the learning and feedback cycle. These differences in students’ perceptions may reflect aspects of student background and preparation, goal- or task-oriented motivation, or maturity of thinking. Whether the blended learning environment truly led to a shift in thinking associated with lifelong learning remains to be examined.
Helmke, B. P., & Guilford, W. H. (2018, June). Learner Satisfaction and Quality of Student-Faculty Interactions in Traditional vs. Blended Classrooms. Paper presented at the 2018 ASEE Annual Conference & Exposition, Salt Lake City, Utah. https://peer.asee.org/30751