
Web Based Technology For Long Term Program Assessment


Conference: 2001 Annual Conference

Location: Albuquerque, New Mexico

Publication Date: June 24, 2001

Start Date: June 24, 2001

End Date: June 27, 2001

ISSN: 2153-5965

Page Count: 6

Page Numbers: 6.1147.1 - 6.1147.6

DOI: 10.18260/1-2--10015

Permanent URL: https://peer.asee.org/10015


Paper Authors: Heidi Diefes-Dux, Kamyar Haghighi


Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Session 2793

Web-Based Technology for Long-Term Program Assessment

Heidi Diefes-Dux, Kamyar Haghighi
Purdue University, West Lafayette, IN

Abstract

During its first round of assessment plan implementation, the Department of Agricultural and Biological Engineering (ABE) at Purdue University collected data using a variety of assessment tools, including ABET-compliant course profiles and constituent surveys. The man-hours involved in developing program-unique assessment tools and in collecting and analyzing data are astounding, especially when a university adopts a decentralized stance on individual program assessment. The question is constantly raised whether individual engineering departments can afford to devote so much faculty, staff, and student time to the data collection and analysis process. This paper highlights how ABE has addressed this issue through the in-house development of a web-based data collection process.

I. Introduction

In December 1998, the Academics Program Committee (APC), an ABE departmental committee comprising faculty, staff, and student representatives, was charged with coordinating and leading the departmental ABET effort for a Fall 2001 review. At that time, the only school of engineering at Purdue University to have made any headway in deciphering EC 2000 was Civil Engineering, and the Deans of Engineering placed the development and implementation of assessment plans squarely at the department/school level. This meant that each engineering school was charged with creating its own means of data collection, interpretation, and analysis. For large schools with substantial resources, such as Chemical Engineering, it was possible to hire outside consultants to develop assessment tools, collect data, and analyze results. A small department like ABE does not have similar resources. The APC, with the support of the ABE faculty, had to develop its own assessment tools and, with staff support, conduct the assessments and analyze the results in-house.

II. Program Outcomes - A Common Survey Item

The assessment tools that ABE has developed include surveys of alumni, employers, graduating seniors, faculty, and students. When collecting data, a common set of questions is needed so that the responses of the different constituents can be compared. Program Outcomes (POs), broad descriptions of what a graduate is expected to know and be able to do after completing an academic program [1], can serve as the basis for these common questions. Performance criteria (PCs) are specific, more directly measurable skills and abilities [2]. Under each PO there are on average 5 PCs, for a total of 60 PCs. While POs are generally regarded as not directly measurable, the number of PCs that fall under each PO is unwieldy for individual student and alumni evaluation and for program modification/change. To achieve an acceptable return rate of completed surveys, ABE kept the length of each survey commensurate with the perceived level of interest each constituent has in the program. ABE therefore elected to have the constituents evaluate the program based on the overall achievement of each PO.
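The web-based collection process itself is not described on this first page. As an illustration only, the sketch below shows one way PO-based survey responses from different constituents might be aggregated once collected; the rating scale, constituent labels, function names, and example data are assumptions made for the sketch, not the authors' implementation.

# Minimal sketch (assumed design, not the ABE system): each constituent rates
# overall achievement of each Program Outcome (PO) on a hypothetical 1-5 scale,
# and responses are averaged per PO across all constituent groups.
from collections import defaultdict
from statistics import mean

# Illustrative example data only.
responses = [
    {"constituent": "alumni",  "ratings": {"PO1": 4, "PO2": 3, "PO3": 5}},
    {"constituent": "senior",  "ratings": {"PO1": 5, "PO2": 4, "PO3": 4}},
    {"constituent": "faculty", "ratings": {"PO1": 3, "PO2": 4, "PO3": 4}},
]

def summarize_by_outcome(responses):
    """Average the ratings for each PO across all constituent responses."""
    by_po = defaultdict(list)
    for r in responses:
        for po, rating in r["ratings"].items():
            by_po[po].append(rating)
    return {po: mean(vals) for po, vals in sorted(by_po.items())}

if __name__ == "__main__":
    for po, avg in summarize_by_outcome(responses).items():
        print(f"{po}: average rating {avg:.2f}")

A summary of this kind, keyed to the POs rather than the 60 underlying PCs, is what makes a short common question set workable across alumni, employer, senior, faculty, and student surveys.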


Haghighi, K., & Diefes-Dux, H. (2001, June). Web Based Technology For Long Term Program Assessment. Paper presented at the 2001 Annual Conference, Albuquerque, New Mexico. 10.18260/1-2--10015
