2007 ASEE Annual Conference & Exposition, Honolulu, Hawaii, June 24–27, 2007
ISSN: 2153-5965
Division: Engineering Technology
Pages: 12.63.1 - 12.63.9 (9 pages)
DOI: 10.18260/1-2--1973
Permanent URL: https://peer.asee.org/1973
A Methodology for Direct Assessment of Student Attainment of Program Outcomes
Abstract
While not directly required by Criterion 3 of the ABET accreditation criteria for engineering technology programs, some form of direct assessment of student attainment of program outcomes is generally expected. Unfortunately, direct assessment can be overlooked by program faculty, often leading to an overreliance on indirect assessments such as surveys. This paper describes a successful methodology for faculty-driven, direct assessment of student attainment of program outcomes. The method requires neither sophisticated technology nor student-created and student-maintained portfolios. The system provides data on student attainment of program outcomes by course, thus enabling curriculum improvement. The system also directly links examples of student work to program outcomes, a significant advantage because it melds the old and new accreditation requirements regarding student work samples. The method configures the materials used by the faculty each semester for their assessment of outcomes in the same format viewed by ABET evaluators during a visit. Thus, the assessment process is institutionalized and last-minute preparation for an ABET visit is minimized.
Introduction
The assessment of student attainment of program outcomes required by ABET accreditation criteria presents challenges for engineering education programs. Criterion 3 of the 2007/2008 criteria for accrediting engineering technology programs states that programs must demonstrate that student assessments are being used as part of a broad, documented continuous improvement process. In addition, multiple assessment methods are to be used to “triangulate” data to ensure that program outcomes and objectives are being met. The criteria go on to suggest possible assessment methods, including “student portfolios, student performance in project work and activity-based learning; results of integrated curricular experiences; relevant nationally-normed examinations; results of surveys to assess graduate and employer satisfaction with employment, career development, career mobility, and job title; and preparation for continuing education”1. The details of these assessment procedures are left to the discretion of each institution. Using data from employer and graduate surveys is convenient because the results can be quantified and because someone other than the faculty does the work of completing the surveys.
However, while not directly required by Criterion 3, some form of direct assessment of student attainment of program outcomes is generally expected. Unfortunately, direct assessment can be overlooked by program faculty, often leading to an overreliance on survey information or other indirect measures. In 2004, Gloria Rogers stated in her “How are we doing?” article in the ABET Communications Link2, “Most educational units are still depending on indirect methods such as surveys .... This is the most serious flaw in the current data collection methods observed ...” Anecdotal evidence gained from talking with individuals in various engineering technology programs indicates that direct versus indirect assessment is still a significant issue in such programs. The ABET web site provides information about assessment
Danielson, S., & Rogers, B. (2007, June), A Methodology For Direct Assessment Of Student Attainment Of Program Outcomes Paper presented at 2007 Annual Conference & Exposition, Honolulu, Hawaii. 10.18260/1-2--1973