
Program Improvements Resulting From Completion Of One Abet 2000 Assessment Cycle

Conference

2003 Annual Conference

Location

Nashville, Tennessee

Publication Date

June 22, 2003

Start Date

June 22, 2003

End Date

June 25, 2003

ISSN

2153-5965

Conference Session

Advisory Boards & Program Assessment

Page Count

6

Page Numbers

8.946.1 - 8.946.6

DOI

10.18260/1-2--11754

Permanent URL

https://peer.asee.org/11754

Paper Authors

Sindee Simon

Theodore Wiesner

Lloyd Heinze

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Session 3413

Program Improvements Resulting from Completion of One ABET 2000 Assessment Cycle

S. L. Simon,1 T. F. Wiesner,1 and L. R. Heinze2
1 Dept. of Chemical Engineering, Texas Tech University
2 Dept. of Petroleum Engineering, Texas Tech University

Introduction

With the advent of ABET 2000, self-assessment of engineering programs has become important. To this end, it is essential to define the assessment methods and metrics against which a program will be judged. Various assessment tools exist, ranging from standardized tests to performance-based assessment to more subjective instruments, such as student surveys of their learning and/or knowledge. No assessment tool is ideal. For example, standardized exams have been criticized over concerns of reduced instructor autonomy [1] and alteration of curriculum goals (teaching to the test) [2]; in addition, the results of standardized tests may be influenced by student motivation [3]. On the other hand, student self-evaluations of learning and/or knowledge are subjective. Several researchers have argued that combining different types of assessment tools yields a more successful assessment [4,5].

Equally important, the assessment data obtained must be analyzed and presented efficiently, so that program problems can be identified and improvements implemented [6]. In this paper, we present the assessment methods and metrics used in the Departments of Chemical Engineering and Petroleum Engineering at Texas Tech University, along with the program improvements that have resulted from completing the assessment cycle.

Assessment and Metrics

The metrics used to evaluate a program should be tied directly to that program's objectives, and the more clearly those objectives are defined, the easier it is to develop appropriate metrics. As an example, there are three program objectives for the Department of Chemical Engineering at Texas Tech University:

Program Objective 1: Provide students with a high quality education that will enable them to adapt to a rapidly changing technical environment.
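The idea of tying metrics directly to objectives can be illustrated with a small sketch. The instrument names, 1-to-5 scoring scale, averaging scheme, 3.0 threshold, and 70% target below are all hypothetical illustrations, not figures taken from the paper; they simply show how scores from several assessment instruments might be combined into a single pass/fail metric for one program objective.

```python
# Hypothetical sketch: combine per-student scores from multiple assessment
# instruments into one metric for a program objective. All names and
# numbers (threshold, target, instruments) are illustrative assumptions.

def metric_satisfied(scores_by_instrument, threshold=3.0, target=0.70):
    """Return (fraction_meeting, satisfied).

    fraction_meeting -- share of students whose average score across all
                        instruments is at or above `threshold`
    satisfied        -- True when that share meets the `target` fraction
    """
    n = len(next(iter(scores_by_instrument.values())))
    combined = []
    for i in range(n):
        # Average each student's scores across all instruments.
        combined.append(
            sum(scores[i] for scores in scores_by_instrument.values())
            / len(scores_by_instrument)
        )
    fraction_meeting = sum(1 for c in combined if c >= threshold) / n
    return fraction_meeting, fraction_meeting >= target

# Example: scores on a 1-5 scale from two instruments for four students.
data = {
    "exit_survey":     [4.0, 3.5, 2.5, 4.5],
    "capstone_rubric": [3.5, 3.0, 2.0, 4.0],
}
frac, ok = metric_satisfied(data)  # frac = 0.75, ok = True
```

Keeping the combination rule explicit like this makes it easy to report, per objective, both the raw fraction of students meeting the bar and whether the program-level target was met.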

Proceedings of the 2003 American Society for Engineering Education Annual Conference & Exposition Copyright © 2003, American Society for Engineering Education

Simon, S., & Wiesner, T., & Heinze, L. (2003, June), Program Improvements Resulting From Completion Of One Abet 2000 Assessment Cycle Paper presented at 2003 Annual Conference, Nashville, Tennessee. 10.18260/1-2--11754

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2003 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.