
A Technique For Program Wide Direct Assessment Of Student Performance

Conference

2007 Annual Conference & Exposition

Location

Honolulu, Hawaii

Publication Date

June 24, 2007

Start Date

June 24, 2007

End Date

June 27, 2007

ISSN

2153-5965

Conference Session

Direct Measures of Student Performance

Tagged Division

Civil Engineering

Page Count

9

Page Numbers

12.144.1 - 12.144.9

DOI

10.18260/1-2--2714

Permanent URL

https://peer.asee.org/2714


Paper Authors

Fred Meyer, U.S. Military Academy

Colonel Karl F. (Fred) Meyer is an Associate Professor and Civil Engineering Program Director in the Department of Civil and Mechanical Engineering at the United States Military Academy (USMA) at West Point, NY. He is a registered Professional Engineer in Virginia. COL Meyer received a B.S. degree from USMA in 1984, and M.S. and Ph.D. degrees in Civil Engineering from the Georgia Institute of Technology in 1993 and 2002, respectively.

Stephen Bert, U.S. Military Academy

Major Steve Bert is an instructor in the Department of Civil and Mechanical Engineering at the United States Military Academy. He serves as Course Director for CE404, Design of Steel Structures, and CE492, Senior Capstone Design. He is a registered Professional Engineer in Virginia. MAJ Bert received a B.S. degree from Norwich University in 1995 and an M.S.C.E. degree from Virginia Tech in 2005.

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

A Technique for Program-Wide Direct Assessment of Student Performance

Abstract

This paper builds on previous work related to the direct assessment of student performance, which assessed CE program outcomes using a single senior-level capstone design course. This paper illustrates a systematic approach to the direct assessment of program outcomes across the entire CE program. The civil engineering program outcomes reflect the current ABET Criteria 3a–k as well as the ASCE Body of Knowledge (BOK).

The approach integrates existing grading practices and correlates the results with the desired program outcomes. This system of direct assessment provides a quantitative assessment without increasing faculty workload by leveraging what is already being done in the evaluation and grading of student work. The technique uses embedded indicators: specific student performance events common to all students in a course, such as homework problems, projects, and tests. The program director and course directors identify potential embedded indicators that correlate strongly with the desired program outcomes. In addition to the embedded indicators, non-standard measures of program outcomes, such as membership in the ASCE student chapter and performance on the Fundamentals of Engineering Exam, are considered.

The greatest benefit of a well-developed system of embedded indicators is that it provides a quantitative assessment without increasing faculty workload. The quantitative assessment can then be used to validate an “anecdotal” assessment or to identify areas for improvement that may not be readily apparent. This simple yet thorough assessment enables programs to spend time developing improvements or identifying needed resource re-allocation instead of collecting and compiling assessment data.

Introduction

The purpose of this paper is to discuss a program-wide assessment system developed at the United States Military Academy (USMA) and used in the Civil Engineering (CE) program. The ABET requirement to demonstrate a process for program assessment is best approached on a continual basis with annual updates. Within the Department of Civil & Mechanical Engineering at USMA, course assessments are conducted at the conclusion of each course; in attendance are the instructors who taught the course as well as department leadership responsible for overall course and program oversight. During the course assessment meeting, an in-depth analysis of the course is conducted that covers not only administrative items but also a review of the course’s embedded indicators that contribute to the overall program assessment. The embedded indicators for each course are specifically identified by the program director to provide a direct assessment of student learning for a given program outcome. At the program level, the data from each embedded indicator are compiled into an overall spreadsheet broken down by the 16 program outcomes. The process of identifying specific embedded indicators for each course began during Academic Year (AY) 05-06; the results are now being collected. The focus of this paper is to provide an overview of the assessment process and to provide initial
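The program-level roll-up described above, in which embedded-indicator results from each course are compiled into a single spreadsheet organized by program outcome, can be sketched in a few lines. This is a minimal illustration only: the course numbers, outcome labels, and scores below are hypothetical and do not come from the paper.

```python
# Minimal sketch of compiling embedded-indicator results by program outcome.
# Each record is (course, program outcome, class-average score on the indicator).
# All names and numbers are illustrative, not actual USMA CE program data.
from collections import defaultdict


def compile_outcomes(indicator_results):
    """Average embedded-indicator scores (0-100) for each program outcome."""
    by_outcome = defaultdict(list)
    for course, outcome, score in indicator_results:
        by_outcome[outcome].append(score)
    return {outcome: sum(scores) / len(scores)
            for outcome, scores in by_outcome.items()}


# Hypothetical embedded-indicator records from two courses
results = [
    ("CE404", "design",          88.0),
    ("CE492", "design",          82.0),
    ("CE404", "problem solving", 91.0),
]
summary = compile_outcomes(results)  # e.g. {"design": 85.0, ...}
```

In practice, each row of such a spreadsheet would carry the specific graded event (homework problem, project, or test question) and the course it came from, so that a weak outcome average can be traced back to the courses and events that produced it.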

Meyer, F., & Bert, S. (2007, June), A Technique For Program Wide Direct Assessment Of Student Performance Paper presented at 2007 Annual Conference & Exposition, Honolulu, Hawaii. 10.18260/1-2--2714

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2007 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015