
An Analysis of Multi-Year Student Questionnaire Data from a Software Engineering Course


Conference: 2007 Annual Conference & Exposition
Location: Honolulu, Hawaii
Publication Date: June 24, 2007
Start Date: June 24, 2007
End Date: June 27, 2007
ISSN: 2153-5965
Conference Session: Software Engineering Topics
Tagged Division: Software Engineering Constituent Committee
Page Count: 19
Page Numbers: 12.198.1 - 12.198.19
DOI: 10.18260/1-2--2991
Permanent URL: https://peer.asee.org/2991


Paper Authors

Valentin Razmov, University of Washington

Valentin Razmov is an avid teacher, interested in methods to assess and improve the effectiveness of teaching and learning. He is a Ph.D. candidate in Computer Science and Engineering at the University of Washington (Seattle), expected to graduate in 2007. Valentin received his M.Sc. in Computer Science from UW in 2001 and, prior to that, a B.Sc. with honors in Computer Science from Sofia University (Bulgaria) in 1998.



Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract.

An Analysis of Multi-Year Student Questionnaire Data from a Software Engineering Course

1. Introduction

Improving student learning has been a long-standing goal of educators across all disciplines. To improve effectively and methodically, one needs to know what works well (and needs to be sustained) and what does not work well (and may benefit from changing). In a classroom environment, the two direct stakeholders, instructors and students, can both provide valuable perspectives on how things are going.

This paper presents an analysis of an extensive set of feedback data provided by students across 8 academic terms for an undergraduate introductory course in software engineering, taught at a large US public university. The feedback was gathered via end-of-term course-specific questionnaires, separate from and much more detailed than the typical university-sponsored course evaluations. In total, 162 students gave feedback, while 5 different instructors were involved with the course, one of whom – the author of this paper – was actively engaged in all 8 offerings.

To give the reader a sense of scale, the end-of-term student questionnaires featured 60-150 questions, mostly multiple-choice, along with some free-form short-answer questions. The questions covered the course structure, the instructors’ teaching approach, class sessions, readings, writing assignments, project experiences, tools, and the feedback that students received from instructors and peers, as well as student perceptions of what had worked well and what had not.
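To make this concrete, here is a minimal, hypothetical Python sketch of how multiple-choice responses of this kind could be tallied per term and per question. The CSV layout, the column names ("term", "question_id", "answer"), and the file name are illustrative assumptions, not the actual format of our questionnaire data.

    # Hypothetical sketch: tally multiple-choice answers per (term, question).
    # The CSV columns assumed here are illustrative, not the paper's format.
    import csv
    from collections import Counter, defaultdict

    def tally_by_term(path):
        counts = defaultdict(Counter)  # (term, question_id) -> answer counts
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                counts[(row["term"], row["question_id"])][row["answer"]] += 1
        return counts

    if __name__ == "__main__":
        for (term, qid), answers in sorted(tally_by_term("responses.csv").items()):
            print(term, qid, dict(answers))

Per-term tallies of this sort make it straightforward to track how answer distributions shift across the 8 course offerings.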

Among the encouraging results, students almost unanimously report feeling better prepared for industry careers after taking the course. They also increasingly emerge with a heightened appreciation for the value of incremental project development and for many of the “softer” (non-technical, human) issues in engineering. In contrast, the main aspects that our analysis identifies as needing further improvement are the choice of course readings and the emphasis on quality assurance practices and on techniques for dealing with ambiguity, both of which students tend to find unfamiliar and unnatural. We also share a few surprises found in the data.

Our main contributions are an analysis of the rich body of collected data, the distillation of groups of questions that have yielded particularly useful results, and a categorization of those questions by target outcome: questions for evolving the course, for “reading” students’ moods, and for getting students to reflect on their experiences. Many of these questions may be broadly applicable.

The remainder of the paper is structured as follows. Section 2 elaborates on relevant aspects of the course structure and describes our mechanism for collecting feedback data. Section 3 discusses what we have learned from our data analysis – first about the course, and then about the process of doing student surveys. We conclude in Section 4. To give the reader a concrete view into the nature of our questionnaires, the Appendix contains the full list of questions from the most recent end-of-term student questionnaire.

Razmov, V. (2007, June). An Analysis of Multi-Year Student Questionnaire Data from a Software Engineering Course. Paper presented at the 2007 Annual Conference & Exposition, Honolulu, Hawaii. DOI: 10.18260/1-2--2991
