Improving Scores On Course Evaluations: Experienced Faculty Tell What Works

Conference

2009 Annual Conference & Exposition

Location

Austin, Texas

Publication Date

June 14, 2009

Start Date

June 14, 2009

End Date

June 17, 2009

ISSN

2153-5965

Conference Session

Getting Started: Objectives, Rubrics, Evaluations, and Assessment

Tagged Division

New Engineering Educators

Page Count

10

Page Numbers

14.708.1 - 14.708.10

DOI

10.18260/1-2--5516

Permanent URL

https://216.185.13.174/5516

Download Count

413

Paper Authors

Edward Gehringer North Carolina State University

Ed Gehringer is an associate professor in the Department of Computer Science and the Department of Electrical and Computer Engineering at North Carolina State University. He has been a frequent presenter at education-based workshops in the areas of computer architecture and object-oriented systems. His research interests include architectural support for memory management, garbage collection, and computer-supported collaborative learning. He received a B.S. from the University of Detroit (now the University of Detroit Mercy) in 1972, a B.A. from Wayne State University, also in 1972, and a Ph.D. from Purdue University in 1979.

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Improving Scores on Course Evaluations: Experienced Faculty Tell What Works

Edward F. Gehringer, North Carolina State University, efg@ncsu.edu

Abstract

On many campuses, student course evaluations are the primary means of evaluating teaching, and they can affect performance reviews, tenure, and promotion. Faculty therefore want to improve their scores. But how can they? This paper examines the experience of several faculty members who managed to raise their scores significantly. Some did so by taking steps that improved their courses or their attentiveness to students; others simply “dumbed down” their courses. There is an extensive literature on factors affecting course-evaluation scores. It confirms some of what our informants told us and calls into question some of their other observations. In particular, we discuss the issue of leniency at length, and we conclude with three recommendations to new engineering instructors on how to improve their own scores. The first is to be more attentive to students and their needs. Next, an instructor should focus on what is being learned rather than what is being taught. Finally, faculty should avail themselves of institutional support for improving teaching.

1. Introduction

In most engineering schools, teaching is the most important factor, after research, on which reappointment, promotion, and tenure are based. And teaching is most often evaluated using student course evaluations. This places faculty and students in a reciprocal relationship in which each party assesses the other and influences the other's subsequent advancement. For this reason, student course evaluation is one of the most contentious issues [1] in all kinds of academic departments and all kinds of institutions. Instructors rightly point out that other factors should be considered when determining the efficacy of teaching. Thus, in recent years, peer evaluation of teaching [2] has taken its place alongside student evaluations in determining teaching competence. But faculty remain uneasy about their student evaluations, regarding them almost fatalistically as something potentially important over which they have little control.

The goal of this work is to present the cases of a number of engineering and computer-science faculty who did manage to improve their scores, in hopes that they can serve as role models. We identify several aspects of their teaching where change made a difference. We then compare their observations to what the published literature reveals, and we conclude with recommendations for faculty who want to improve their scores.

Our respondents came from two mailing lists: the Engineering Technology listserv, etd-l@listproc.tamu.edu, serving ASEE’s Engineering Technology division, and the SIGCSE members list, SIGCSE-members@LISTSERV.ACM.ORG, serving the Special Interest Group on Computer Science Education of the Association for Computing Machinery. The author posted

Gehringer, E. (2009, June), Improving Scores On Course Evaluations: Experienced Faculty Tell What Works Paper presented at 2009 Annual Conference & Exposition, Austin, Texas. 10.18260/1-2--5516

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2009 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015