Comparison Of Student And Faculty Assessments Of The Effectiveness Of Learning Activities

Conference

2003 Annual Conference

Location

Nashville, Tennessee

Publication Date

June 22, 2003

Start Date

June 22, 2003

End Date

June 25, 2003

ISSN

2153-5965

Conference Session

ASEE Multimedia Session

Page Count

8

Page Numbers

8.309.1 - 8.309.8

DOI

10.18260/1-2--12173

Permanent URL

https://peer.asee.org/12173

Download Count

411

Paper Authors

Lynn Bellamy

Barry McNeill

Veronica Burrows

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Session 2379

Comparing Student and Faculty Assessments of the Effectiveness of Learning Activities

Veronica A. Burrows, Barry W. McNeill, Lynn Bellamy

Department of Chemical and Materials Engineering / Mechanical and Aerospace Engineering, Arizona State University, Tempe, AZ 85287

Abstract

A robustly designed course normally comprises a variety of learning activities, each intended to facilitate the achievement of specific learning objectives to a specific depth or level of learning. In other words, faculty usually design the learning activities of their courses with specific learning objectives in mind. With the implementation of outcomes-based assessment, student self-assessment of their own learning, and of the effectiveness of the learning activities in their courses, is a significant part of the course and program assessment of learning effectiveness.

Students in an introductory engineering class were required at semester’s end to assess the effectiveness of course learning activities (homework, projects, lectures, assigned textbook readings, etc.) in supporting their achievement of the course learning objectives. This was accomplished through the use of a matrix that mapped each of the course learning objectives to the course learning activities. Instructional faculty also assessed the intended impact of the course’s learning activities, as well as their judgment of the actual effectiveness of those activities.

Faculty assessments of intended impact fairly closely matched their estimates of actual impact; however, there were significant differences between faculty assessments of effectiveness and student assessments of effectiveness. Detailed results and their implications for using student assessments of the teaching effectiveness of various learning activities will be presented.

Introduction

Student evaluations of faculty teaching effectiveness are a well-established, essentially universal element of post-secondary education1. There are many approaches taken in the design of such evaluations, including both quantitative questions (e.g., “Rate on a scale of 1 to 5 . . .”) and qualitative questions (e.g., “What did you like best . . .”) regarding faculty attitudes and behaviors, and student satisfaction with these. While the major expected outcome of faculty teaching is student learning, surprisingly, aside from questions concerning the textbook, few student

Proceedings of the 2003 American Society for Engineering Education Annual Conference & Exposition Copyright ©2003, American Society for Engineering Education

Bellamy, L., McNeill, B., & Burrows, V. (2003, June). Comparison Of Student And Faculty Assessments Of The Effectiveness Of Learning Activities. Paper presented at the 2003 Annual Conference, Nashville, Tennessee. 10.18260/1-2--12173

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2003 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015