Direct Assessment Of Program Outcomes In A Computer Science And Engineering Program

Conference

2009 Annual Conference & Exposition

Location

Austin, Texas

Publication Date

June 14, 2009

Start Date

June 14, 2009

End Date

June 17, 2009

ISSN

2153-5965

Conference Session

ECE Pedagogy and Assessment II

Tagged Division

Electrical and Computer

Page Count

17

Page Numbers

14.493.1 - 14.493.17

DOI

10.18260/1-2--4730

Permanent URL

https://peer.asee.org/4730

Download Count

335

Paper Authors

Neelam Soundarajan Ohio State University

Neelam Soundarajan is an Associate Professor in the CSE Dept. at the Ohio State University. His technical interests are in Software Engineering, Programming Languages, and issues related to engineering education, including program assessment and improvement.

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Direct Assessment of Program Outcomes in a Computer Science and Engineering Program

Abstract

Although direct assessment of program outcomes is not an explicitly specified requirement of the Engineering Accreditation Criteria (EC), most program evaluators expect to see some use of such assessments. This is not surprising, since direct assessments provide the most reliable evaluations of students' actual achievement of the various outcomes. At the same time, programs have struggled to develop direct assessment mechanisms that are not resource-intensive (in terms of faculty and administration time) and that provide useful results leading to specific, documentable program improvements. In this paper, we report on such a mechanism: one that is powerful in its ability to identify specific program improvements and, at the same time, requires only minimal resources to administer and sustain on a long-term basis.

1. Introduction

Prados, Peterson, and Lattuca, in their article [15] tracing the history and evolution of engineering education and accreditation criteria through the twentieth century, write: “By the late 1980s, . . . engineering practice was changing dramatically and irreversibly . . . [existing programs] produced graduates with strong technical skills, but these graduates were not nearly so well prepared in other skills needed to develop and manage innovative technology . . . engineering accreditation had become an impediment to reform . . . criteria were increasingly prescriptive . . . institutions that attempted flexible and innovative programs were increasingly harassed in accreditation reviews and were forced to make their curricular requirements more restrictive.”

Based on these considerations, Prados et al. note, a new set of criteria for evaluating engineering programs, Engineering Criteria 2000 (EC) [1], was created. The specification of curricular content was significantly reduced in the new criteria. Instead, each program was required to identify a set of program objectives, tailored to the individual program, and a corresponding set of outcomes, including the eleven outcomes (3.a) through (3.k) specified as part of Criterion 3 of EC. At the core of EC is the requirement of a continuous improvement process based on assessing the degree to which graduates of the program achieve the program's outcomes and using the assessment results to drive program improvements. EC also requires clear documentation of the assessment processes used, the assessment results, and the improvements based on these results.

While the curricular flexibility provided by EC has been widely welcomed, many programs have struggled to meet the requirements regarding suitable assessment processes and documented improvements based on the results of the assessment of their outcomes. One key problem has been the question of developing suitable mechanisms for direct assessment of program outcomes. A direct

Soundarajan, N. (2009, June), Direct Assessment Of Program Outcomes In A Computer Science And Engineering Program Paper presented at 2009 Annual Conference & Exposition, Austin, Texas. 10.18260/1-2--4730

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2009 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015