
World Class Outcomes Assessment On A Shoestring


Conference

2008 Annual Conference & Exposition

Location

Pittsburgh, Pennsylvania

Publication Date

June 22, 2008

Start Date

June 22, 2008

End Date

June 25, 2008

ISSN

2153-5965

Conference Session

SE Curriculum and Course Management

Tagged Division

Software Engineering Constituent Committee

Page Count

10

Page Numbers

13.1411.1 - 13.1411.10

Permanent URL

https://peer.asee.org/3417



Paper Authors


Joseph Clifton University of Wisconsin-Platteville


Joseph M. Clifton is a Professor in the Department of Computer Science and Software Engineering at the University of Wisconsin – Platteville. He has a Ph.D. from Iowa State University. His interests include software engineering, real-time embedded systems, and software engineering education.


Rob Hasker University of Wisconsin-Platteville


Robert W. Hasker is a Professor in the Department of Computer Science and Software Engineering at the University of Wisconsin-Platteville. He has a Ph.D. from the University of Illinois at Urbana-Champaign. His interests include software engineering education, programming languages for introductory courses, and formal specifications.


Mike Rowe University of Wisconsin-Platteville


Michael C. Rowe is an Associate Professor in the Department of Computer Science and Software Engineering at the University of Wisconsin - Platteville. He has a Ph.D. from the University of North Texas. His interests include software engineering, software quality assurance techniques, student projects, and software engineering education.



Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

World-Class Outcomes Assessment on a Shoestring

Abstract

In the fall of 2004, the software engineering faculty at the University of Wisconsin - Platteville developed a set of outcomes and assessment procedures using the principles presented at an ABET Faculty 2.0 Workshop. In addition, we have taken extra steps to improve the quality of the assessment process and reduce the effort required. The result is a tightly specified assessment process that allows us to achieve quality assessment results at a reasonable expenditure of faculty time and effort.

Introduction

Criterion three of the ABET engineering accreditation process states that program outcomes must be assessed with evidence that the results of this assessment process are applied to the further development of the program.3 Anecdotally, many who go through the ABET accreditation process view this criterion as the most problematic. Moreover, satisfying this criterion usually requires significant ongoing efforts.

In the fall of 2004, the software engineering faculty at the University of Wisconsin - Platteville developed a set of outcomes and assessment procedures using the principles presented at an ABET Faculty 2.0 Workshop.4 These outcomes assessment procedures share a number of common practices delineated by other authors who have chronicled their experiences with ABET outcomes assessment.2,4,6 Our particular instantiation has:

• Seven program outcomes with two to five performance criteria established for each outcome.
• Two to four measurements for each performance criterion, with at least one direct and one indirect measurement for each. The direct measurements consist of in-course assessments and direct observation. The indirect measurements consist of course surveys and graduating senior exit surveys.
• A fixed set of rubrics for each of the measurements.
• Semester assessment reports summarizing assessment data, identifying problem areas, suggesting improvements, noting where changes due to assessment lead to improvements, and suggesting changes to the assessment process.

In addition to the above, we have taken a few extra steps to improve the quality of the assessment process and reduce the effort required. We have specified how each measurement will be performed. This specification helps ensure that a given measurement is performed reliably regardless of the semester or instructor. We have normalized each rubric to produce a numerical result from one to five for each student. We have created automated trigger values, based on three sets of criteria, that flag those measurements that are below desired levels and therefore require further analysis. These steps go beyond those discussed by other authors.2,4,6
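The trigger idea above could be sketched in a few lines of code. The following is a hypothetical illustration only: the paper does not specify its three sets of criteria, so the function name, the particular rules, and the threshold values here are all assumptions, not the authors' actual process.

```python
# Hypothetical sketch: each rubric yields a normalized score from 1 to 5 per
# student, and a measurement is flagged for further analysis when it fails
# any of three (assumed, illustrative) criteria.
from statistics import mean

def flag_measurement(scores, mean_floor=3.5, min_score=2, max_low_fraction=0.2):
    """Return the list of triggered criteria for one measurement.

    The three criteria are illustrative assumptions, not the paper's rules:
      1. the class mean falls below `mean_floor`
      2. any individual student scores below `min_score`
      3. more than `max_low_fraction` of students score 3 or lower
    """
    triggers = []
    if mean(scores) < mean_floor:
        triggers.append("low mean")
    if min(scores) < min_score:
        triggers.append("very low individual score")
    low_count = sum(1 for s in scores if s <= 3)
    if low_count / len(scores) > max_low_fraction:
        triggers.append("too many low scores")
    return triggers

print(flag_measurement([4, 5, 4, 3, 5]))  # no criteria triggered -> []
print(flag_measurement([2, 3, 1, 3, 4]))  # all three criteria triggered
```

Automating checks like these is one plausible way the effort reduction described in the paper could be realized: instructors record rubric scores, and only flagged measurements demand faculty analysis time.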

Clifton, J., & Hasker, R., & Rowe, M. (2008, June), World Class Outcomes Assessment On A Shoestring Paper presented at 2008 Annual Conference & Exposition, Pittsburgh, Pennsylvania. https://peer.asee.org/3417

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2008 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015