June 26, 2011
June 29, 2011
Educational Research and Methods
22.677.1 - 22.677.8
Experience with the Transition from Paper to On-Line Course Evaluations

Prior to 2006, the Washington State University (WSU) College of Engineering and Architecture (CEA) evaluated all of its classes using paper surveys that were handed out in class and returned by one student to the Dean's office for analysis. In 2006, the CEA changed its procedure to the use of an on-line survey system. This was done for several reasons, including previous problems with breaches of confidentiality and loss of data, future flexibility for separating instructor evaluations in team-taught classes, future ability to change questions easily or to tailor questions for individual classes, reduction of costs, ease of offering midterm surveys, and assistance from a central university resource. However, faculty raised questions about potential changes in the response rate and about how students would respond to open-ended questions.

In the semester prior to beginning on-line surveys, the response rate for paper surveys was found to be approximately 70%, with the shortfall due to class absence and refusal to participate. Thus, a 70% response rate was the goal for on-line surveys. Other experience at WSU suggested a 50% response rate for on-line surveys. This difference created anxiety among some faculty, especially non-tenured faculty who depended on the results of these surveys for their annual reviews and tenure packages. The concern was that, since response was voluntary, only the disgruntled would respond and faculty ratings would decrease simply due to the change in assessment instrument. It was decided that the same questions used in the paper survey would be used for the first administrations of the on-line survey so that comparisons could be made between responses on the paper survey and responses on the on-line survey. In addition, faculty members were given the option to conduct paper surveys during the transition period, although only approximately 15% did so, and this number decreased with time.
Since the process was voluntary, an extensive e-mail and poster advertising campaign directed at both students and faculty was used. While no extra credit was granted for participation, students who participated had the option to enter a lottery for gift certificates at a local bookstore. Finally, in order to increase the number of participants, the survey was opened two weeks before the semester ended and remained open until just before grades were released.

The following assessments were made: 1) an analysis of response rate trends by department, 2) a question-by-question comparison between paper and on-line responses to questions with numerical responses over four semesters, and 3) a comparison of the total number and length of narrative responses to open-ended questions. Initially, the overall response rate did drop to 45%, but it then increased to nearly 60% and has now settled to approximately 55%. A question-by-question comparison of the student numerical responses to identical questions showed no perceptible change despite the reduction in response rate. Finally, no significant change in the length of responses to open-ended questions was found.
Olsen, R. G., Kranov, A. A., & Reinkens, K. A. (2011, June). Experience with the College-Wide Transition from Paper to On-line Course Evaluations. Paper presented at the 2011 ASEE Annual Conference & Exposition, Vancouver, BC. 10.18260/1-2--17958
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2011 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.