A Process For The Direct Assessment Of Program Learning Outcomes Based On The Principles And Practices Of Software Engineering

Conference

2007 Annual Conference & Exposition

Location

Honolulu, Hawaii

Publication Date

June 24, 2007

Start Date

June 24, 2007

End Date

June 27, 2007

ISSN

2153-5965

Conference Session

Emerging Trends in Engineering Education Poster Session

Page Count

12

Page Numbers

12.96.1 - 12.96.12

DOI

10.18260/1-2--2868

Permanent URL

https://peer.asee.org/2868

Paper Authors

Robert Lingard California State University-Northridge

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

A Process for the Direct Assessment of Program Learning Outcomes Based on the Principles and Practices of Software Engineering

Abstract

The Computer Science Department at California State University, Northridge (CSUN) has developed and is currently using a process for the direct assessment of program learning outcomes that follows an approach similar to that used in software development. The software engineering steps have been applied iteratively to the direct assessment process. Before this approach was developed, the assessment of learning outcomes within the department was based primarily on indirect measures, such as student self-assessments or the subjective views of faculty, employers, and others. Additionally, when learning was assessed directly, it was usually in the context of specific courses. The department recognized the need to assess student learning directly and to assess how well that learning was retained over time.

The new assessment process was developed to meet the following goals. First, it needed to facilitate the direct assessment of student learning outcomes and to measure not just the skills and knowledge of students upon completing a specific course, but what they retained as they neared completion of the program. Second, it needed to be a continuous process that ensured the assessment of all program outcomes over a reasonable length of time. Third, it needed to fit within the existing operations of the department and the activities of individual instructors; that is, it had to be efficient and not unduly burdensome for faculty members. Finally, it needed to satisfy the ABET requirements for assessment, as well as those of the University.

The four major steps in software development (requirements analysis, design, implementation, and validation) have been applied to the assessment process. In the requirements analysis step, a determination is made of which learning outcomes to assess during the current cycle. The most important outcomes are chosen, possibly based on the results of indirect assessments or previously conducted direct assessments. In the design step, for each selected outcome, a group consisting of all faculty teaching courses strongly related to that outcome is charged with developing an assessment plan. This activity involves selecting or developing assessment instruments and rubrics. During implementation, the assessment plan developed during design is carried out and the results are analyzed. Based on this analysis, the validity of the results is determined. For valid results, recommendations for program improvement are made as appropriate. For assessments considered invalid, recommendations for improved ways to assess the outcomes in question are made. The process then returns to the first step for the next iteration.
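The iterative cycle described above can be sketched in code. This is only an illustrative model, not anything from the paper: all class names, field names, and the stubbed plan/analysis values are hypothetical, and real assessment work (faculty deliberation, rubric scoring) is reduced to placeholder functions.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    name: str
    priority: int  # hypothetical ranking, e.g. from prior indirect assessments

@dataclass
class AssessmentResult:
    outcome: Outcome
    valid: bool
    findings: str

def select_outcomes(outcomes, per_cycle):
    """Requirements analysis: choose the most important outcomes this cycle."""
    return sorted(outcomes, key=lambda o: o.priority, reverse=True)[:per_cycle]

def design_plan(outcome):
    """Design: a faculty group selects instruments and rubrics (stubbed)."""
    return {"outcome": outcome, "instrument": "embedded exam question", "rubric": "4-level"}

def implement_and_analyze(plan):
    """Implementation: carry out the plan and analyze results (stubbed as valid)."""
    return AssessmentResult(plan["outcome"], valid=True, findings="target met")

def run_cycle(outcomes, per_cycle=2):
    """Validation: valid results drive program improvements; invalid
    results drive improvements to the assessment techniques themselves."""
    improvements, technique_fixes = [], []
    for outcome in select_outcomes(outcomes, per_cycle):
        result = implement_and_analyze(design_plan(outcome))
        (improvements if result.valid else technique_fixes).append(result)
    return improvements, technique_fixes
```

One design point the sketch preserves: an invalid result is not discarded but routed to its own improvement track, which is what lets the process refine its assessment techniques across iterations.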

The process described above has been in operation for the last two years and has proven very effective for gathering assessment data. Both valid and invalid assessment results have been obtained using the process. Many of the successful assessments have led to important program improvements, and the unsuccessful attempts have resulted in improvements in assessment techniques.

Lingard, R. (2007, June), A Process For The Direct Assessment Of Program Learning Outcomes Based On The Principles And Practices Of Software Engineering Paper presented at 2007 Annual Conference & Exposition, Honolulu, Hawaii. 10.18260/1-2--2868

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2007 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015