Development Of Performance Criteria For Assessing Program Outcomes In Engineering, Engineering Technology & Computer Science Programs

Conference

2008 Annual Conference & Exposition

Location

Pittsburgh, Pennsylvania

Publication Date

June 22, 2008

Start Date

June 22, 2008

End Date

June 25, 2008

ISSN

2153-5965

Conference Session

Accreditation Issues

Tagged Division

Mechanical Engineering

Page Count

13

Page Numbers

13.434.1 - 13.434.13

DOI

10.18260/1-2--3793

Permanent URL

https://peer.asee.org/3793

Download Count

535

Paper Authors

Paul Biney, Prairie View A&M University

Dr. Paul O. Biney is a Professor in the Mechanical Engineering Department at Prairie View A&M University, and the Director of the Future Aerospace Science & Technology (FAST) Center. He is a registered professional engineer in Texas. His areas of expertise include processing, fabrication and characterization of high temperature polymer matrix composites, multifunctional nanocomposites and energy systems design. He is also the chairman of the College of Engineering Assessment Committee and oversees program outcomes assessment in the College of Engineering. Dr. Biney teaches courses in the Thermal Science area.

Raghava Kommalapati, Prairie View A&M University

Dr. Raghava Kommalapati is an Associate Professor in Civil and Environmental Engineering and a member of the College of Engineering Assessment Committee.

Michael Gyamerah, Prairie View A&M University

Dr. Michael Gyamerah is an Associate Professor in Chemical Engineering and a member of the College of Engineering Assessment Committee.

Annamalai Annamalai, Prairie View A&M University

Dr. A. Annamalai is an Associate Professor in Electrical and Computer Engineering and a member of the College of Engineering Assessment Committee.

Pamela Obiomon, Prairie View A&M University

Dr. Pamela Obiomon is an Assistant Professor in Electrical and Computer Engineering and a member of the College of Engineering Assessment Committee.

Xiaobo Peng, Prairie View A&M University

Dr. Xiaobo Peng is an Assistant Professor in the Mechanical Engineering Department and a member of the College of Engineering Assessment Committee.

Mohan Ketkar, Prairie View A&M University

Dr. Mohan Ketkar is an Assistant Professor in Engineering Technology and a member of the College of Engineering Assessment Committee.

Nripendra Sarker, Prairie View A&M University

Dr. Nripendra Sarker is a Lecturer in Engineering Technology and a member of the College of Engineering Assessment Committee.

Ravindra Iyengar, Prairie View A&M University

Mr. Ravindra Iyengar is an Assistant Professor in Computer Science and a member of the College of Engineering Assessment Committee.

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Development of Performance Criteria for Assessing Program Outcomes in Engineering, Engineering Technology and Computer Science Programs

Abstract

This paper presents the development and use of performance criteria for the detailed assessment of student performance on the program outcomes defined for Engineering programs (EAC Criterion 3, a-k outcomes), Engineering Technology programs (TAC Criterion 2, a-k outcomes), and Computer Science programs (CAC Criterion 1, a-i outcomes). Performance criteria are used to break down each program outcome into concrete, measurable actions students are expected to perform to demonstrate proficiency in that outcome. Detailed performance criteria are presented for every outcome listed by the three ABET accreditation commissions, together with suggestions on how they can be used, so that individual programs and courses can adopt them selectively. This approach to defining and using performance criteria enables faculty to (1) fully understand the outcomes, (2) understand the range of performance criteria that must be measured for each outcome, and (3) remove ambiguity in the interpretation of the outcomes. In addition, it identifies the critical skill sets to measure for each outcome and makes assessment meaningful to the various programs.

Introduction

Since the advent of EC 2000, Engineering, Engineering Technology, and Computer Science programs have grappled with methods for assessing the ABET outcomes, especially the skills that are not explicitly taught in traditional programs. Even though several assessment methods have been published in the literature [1, 2, 3, 4], there is still a need to establish concrete performance criteria for the outcomes so that assessment results can be interpreted meaningfully. Richard Felder and Rebecca Brent [5] have compiled useful references offering additional suggestions for defining performance criteria for the outcomes discussed in this paper.

Performance criteria are specific, measurable statements that indicate the actions or competencies students should be able to perform or possess by the end of the measurement period. Defining performance criteria for each program outcome is important because it (1) delineates specific statements that identify concrete, measurable actions students should be able to perform to meet the outcome, (2) clearly states what needs to be measured, (3) builds a common understanding among the faculty of how an outcome is to be interpreted, thereby removing ambiguity, (4) informs students of what the outcome expects of them, (5) focuses the type of data to be collected, (6) lends validity to the assessment results, and (7) clearly identifies specific problem areas to be addressed through the assessment process.
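For illustration only, the breakdown of an outcome into performance criteria can be pictured as a one-to-many mapping from the outcome statement to a set of measurable criteria, each scored against a common rubric. The short Python sketch below assumes a hypothetical four-point rubric, hypothetical criterion wording, and a hypothetical 70% attainment target; none of these specifics are taken from the paper.

# Illustrative sketch only: one ABET outcome broken into measurable
# performance criteria, with rubric scores recorded per student.
# The criterion wording, 4-point rubric, and 70% target are hypothetical.

outcome_b = {
    "outcome": "b: an ability to design and conduct experiments, "
               "as well as to analyze and interpret data",
    "performance_criteria": [
        "PC1: formulate a hypothesis and design an experiment to test it",
        "PC2: collect data using appropriate instrumentation",
        "PC3: analyze experimental data using appropriate statistical methods",
        "PC4: interpret results and draw supportable conclusions",
    ],
}

# Rubric scores (1 = unsatisfactory ... 4 = exemplary) for five students,
# e.g. from a laboratory report graded against a common rubric.
scores = {
    "PC1": [4, 3, 2, 4, 3],
    "PC2": [3, 3, 4, 2, 4],
    "PC3": [2, 3, 3, 2, 4],
    "PC4": [4, 4, 3, 3, 3],
}

def percent_meeting_target(rubric_scores, threshold=3):
    """Return the percentage of students scoring at or above the threshold."""
    return 100.0 * sum(s >= threshold for s in rubric_scores) / len(rubric_scores)

for pc, s in scores.items():
    pct = percent_meeting_target(s)
    flag = "" if pct >= 70 else "  <-- below 70% target; flag for improvement"
    print(f"{pc}: {pct:.0f}% of students at or above target{flag}")

Recording criteria this way makes it apparent which specific skill within an outcome is falling short, rather than reporting a single aggregate score for the whole outcome.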

To ensure that the performance criteria can be used by different programs, they were based on the program outcomes for Engineering (EAC Criterion 3, a-k outcomes), Engineering Technology (TAC Criterion 2, a-k outcomes), and Computer Science (CAC Criterion 1, a-i outcomes). The program outcomes from the three ABET accreditation commissions were analyzed and grouped based on similarities. The performance criteria were developed for each

Biney, P., & Kommalapati, R., & Gyamerah, M., & Annamalai, A., & Obiomon, P., & Peng, X., & Ketkar, M., & Sarker, N., & Iyengar, R. (2008, June), Development Of Performance Criteria For Assessing Program Outcomes In Engineering, Engineering Technology & Computer Science Programs Paper presented at 2008 Annual Conference & Exposition, Pittsburgh, Pennsylvania. 10.18260/1-2--3793

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2008 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015