Using Student Performance And Faculty Experience To Assess A Mechanical Engineering Program

Conference

2007 Annual Conference & Exposition

Location

Honolulu, Hawaii

Publication Date

June 24, 2007

Start Date

June 24, 2007

End Date

June 27, 2007

ISSN

2153-5965

Conference Session

Meeting ABET Requirements

Tagged Division

Mechanical Engineering

Page Count

10

Page Numbers

12.1565.1 - 12.1565.10

Permanent URL

https://peer.asee.org/2242

Paper Authors

Bobby Crawford USMA

Bobby Crawford is a Lieutenant Colonel in the United States Army and the Director of the Aero-Thermo Group in the Department of Civil and Mechanical Engineering at the United States Military Academy, West Point, NY. He holds an M.S. and a Ph.D. in Aerospace Engineering and is a licensed Professional Engineer.

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract


Assessing the level at which a Mechanical Engineering program achieves its stated outcomes is essential, not only to a successful ABET evaluation but also to the continued improvement and effectiveness of the program. While survey data is valuable, it should be only one component of a broader assessment plan. The Mechanical Engineering (ME) program at the United States Military Academy (USMA) has employed a method that feeds graded-event averages and standard deviations from student assignments, examinations, and projects into a multi-level assessment tool, providing a valuable measure of how well students are achieving the program outcomes.

In the fall of 2005, the need arose to objectively evaluate how well the students in a design course were achieving USMA’s Engineering and Technology outcomes. The author developed a method to identify the graded events that supported each of the course’s objectives, determine how well they supported those objectives, and then link objective achievement to the USMA-level outcomes through a subjective pair-wise comparison of the course objectives. Positive feedback from faculty in the ME program led to expanding this process to capture student performance data and faculty input from all ME program courses and feed these into a program-level assessment. The resulting evaluation combines the strengths of objective evaluation (based on graded events) and subjective evaluation (based on faculty experience).
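The two-stage aggregation described above can be sketched in code. This is a minimal illustration only, not the authors' actual instrument: the scores, weights, and function names below are invented, and the paper derives its objective-to-outcome weights from a faculty pair-wise comparison rather than the fixed numbers used here.

```python
# Hypothetical sketch of the two-stage assessment aggregation:
# graded-event averages -> course-objective scores -> program-outcome score.
# All names, scores, and weights are invented for illustration.

def weighted_score(scores, weights):
    """Weighted average of scores (e.g., graded-event averages on a 0-100 scale)."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# Stage 1: course-objective achievement from graded-event averages.
# Weights reflect how strongly each graded event supports the objective.
objective_1 = weighted_score(scores=[85.0, 78.0, 91.0], weights=[0.5, 0.3, 0.2])

# Stage 2: program-outcome achievement from course-objective scores.
# In the paper these weights come from a subjective pair-wise comparison
# of course objectives by the faculty; fixed values are used here.
outcome = weighted_score(scores=[objective_1, 82.0], weights=[0.6, 0.4])

print(round(objective_1, 1))  # 84.1
print(round(outcome, 1))      # 83.3
```

A real implementation would repeat Stage 1 for every objective in every course and roll the results up across the whole program, but the weighted-average structure is the same at each level.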

This paper describes the motivation for developing the assessment tool, the components of the tool, how each component is integrated to provide an assessment of course objectives, and how these assessments combine to produce an evaluation of program outcomes. Examples of course and program assessment results are presented. Finally, the paper describes how the results of this assessment instrument have been used to modify course objectives and improve course content within the ME program. This tool has been extremely effective and is now a key component of the USMA ME program assessment.

Introduction

According to ABET, Inc., all accredited engineering programs must establish outcomes that will lead to the attainment of the program’s objectives. There must be a documented assessment process in place “that demonstrates that these program outcomes are being measured and indicates the degree to which the outcomes are achieved.”1 This paper describes a process that is currently in use at USMA to incorporate student performance indicators into the assessment of course objectives and program outcomes.

Mechanical Engineering Program Outcomes

In accordance with ABET, Inc. guidance, the ME program leadership began by defining the outcomes (those things that our students should know and be able to do by graduation) for our

Crawford, B. (2007, June), Using Student Performance And Faculty Experience To Assess A Mechanical Engineering Program Paper presented at 2007 Annual Conference & Exposition, Honolulu, Hawaii. https://peer.asee.org/2242

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2007 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015