
Using Engineering Design To Assess Program Outcomes


Conference: 2005 Annual Conference

Location: Portland, Oregon

Publication Date: June 12, 2005

Start Date: June 12, 2005

End Date: June 15, 2005

ISSN: 2153-5965

Conference Session: BME Research and Design

Page Count: 6

Page Numbers: 10.1408.1 - 10.1408.6

DOI: 10.18260/1-2--15446

Permanent URL: https://peer.asee.org/15446


Paper Authors: John Gassert, Lisa Milkowski


Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Session 2005-1109

Using Rubrics to Evaluate Engineering Design and to Assess Program Outcomes

John D. Gassert, Lisa Milkowski
Department of Electrical Engineering and Computer Science, Milwaukee School of Engineering
1025 North Broadway, Milwaukee, WI 53202-3109
john.gassert@msoe.edu, lisa.milkowski@msoe.edu

Abstract

It has been suggested that all faculty who teach in an engineering program can use rubrics to assess students consistently and, at the same time, use those rubrics to assess program outcomes for continuous improvement. MSOE is working to develop such rubrics to measure student performance directly and to assess outcomes under ABET Criteria Three and Four. One of those rubrics was used to assess student performance in MSOE’s four-year design process. The intent was to provide a direct measurement that could be used to assess program outcomes. This paper describes the development and application of a rubric for engineering design and the difficulties encountered with that rubric. Despite those difficulties, the MSOE biomedical engineering faculty believe rubrics will produce consistent results that can be used to improve the program’s design courses and curriculum.

Introduction

In 1964, referring to pornography, Justice Potter Stewart stated, “I know it when I see it.” That is often the position of faculty members assessing student performance. When asked about the quality of a student’s work, most faculty will say “I know it when I see it”; but what is a symphony to one is noise to another. The biomedical engineering faculty at Milwaukee School of Engineering (MSOE) are working to develop rubrics that directly measure student performance and simultaneously assess program outcomes for their four-year design course. Their hope is to avoid the “I know it when I see it” argument and finely tune the orchestra.

Blanchard suggests a process whereby faculty who teach in an engineering program can use a rubric to consistently assess students and simultaneously use that rubric to assess program outcomes for continuous improvement.[1] The MSOE faculty plan to apply this approach, using their assessment results both for student performance assessment and for continuous program improvement. Although the rubric presented by Blanchard is applied to a single-semester course with its own defined outcomes, the MSOE faculty believe that this process could be applicable to MSOE’s four-year design process. It is expected that


Gassert, J., & Milkowski, L. (2005, June). Using Engineering Design To Assess Program Outcomes. Paper presented at the 2005 Annual Conference, Portland, Oregon. 10.18260/1-2--15446

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2005 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015