
Collecting Programmatic Assessment Data with No “Extra” Effort: Consolidated Evaluation Rubrics for Chemical Plant Design


Conference: 2011 ASEE Annual Conference & Exposition
Location: Vancouver, BC
Publication Date: June 26, 2011
Start Date: June 26, 2011
End Date: June 29, 2011
ISSN: 2153-5965
Conference Session: ABET and Curriculum-Level Assessments
Tagged Division: Chemical Engineering
Page Count: 19
Page Numbers: 22.337.1 - 22.337.19
DOI: 10.18260/1-2--17618
Permanent URL: https://peer.asee.org/17618


Paper Authors


Kevin D. Dahm, Rowan University


Kevin Dahm is an Associate Professor of Chemical Engineering at Rowan University. He received his B.S. from WPI in 1992 and his Ph.D. from MIT in 1998. He has published on teaching engineering design, assessment of student learning, and use of process simulation in undergraduate education. He is the recipient of the 2004 Fahien Award and the 2010 Mid-Atlantic Section Outstanding Teaching Award from ASEE.



Abstract

In order to gain accreditation, engineering programs must define goals and objectives, assess whether their graduates are meeting these objectives, and “close the loop” by using the assessment data to inform continuous improvement of the program. In ABET’s jargon, program “objectives” describe capabilities that graduates are expected to possess, e.g., “Graduates of the Chemical Engineering program at XXX University will be able to….” Thus, the true success of the program in meeting its objectives is reflected in the first few years of graduates’ careers. Practically speaking, a program cannot be expected to assess directly the performance of graduates with respect to these objectives, at least not in a comprehensive way. Consequently, programs are expected to define and assess “outcomes” which fit within the undergraduate curriculum, and which ensure, to the best degree possible, that graduates will meet the program objectives.

A variety of assessment instruments are in common use, and the merits and shortcomings of each have been discussed in the open literature. For example, surveys and exit interviews are commonly used, but they are subjective, rely on self-assessment, and likely oversimplify the questions under examination. This paper focuses on tools for direct measurement of student performance through objective evaluation of work product. Numerous authors have outlined the assessment strategy of constructing rubrics for measuring student achievement of learning outcomes and applying them to portfolios of student work. Other authors have outlined the use of rubrics for evaluation and grading of individual assignments and projects. This paper will describe the use of a consolidated rubric for evaluating final reports in the capstone Chemical Plant Design course. Instead of grading each assignment and then subsequently having it evaluated a second time as a portion of a portfolio, the instructor evaluates the report once using the rubric, and the same raw data is used for both grading and programmatic assessment.
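The abstract stops short of implementation detail, but its central idea, a single scoring pass whose raw data feed both course grading and ABET outcome assessment, can be sketched in a few lines of Python. Everything below is a hypothetical illustration: the rubric items, weights, outcome labels, and 0-4 scoring scale are assumptions for this sketch, not the instrument the paper describes.

from collections import defaultdict

# Hypothetical consolidated rubric: each item carries a grading weight
# and is tagged with the program outcome(s) it provides evidence for.
RUBRIC = {
    "process_flow_diagram":   {"weight": 0.25, "outcomes": ["design"]},
    "economic_analysis":      {"weight": 0.25, "outcomes": ["design", "economics"]},
    "safety_and_environment": {"weight": 0.20, "outcomes": ["professional_responsibility"]},
    "written_communication":  {"weight": 0.30, "outcomes": ["communication"]},
}

def grade_report(scores):
    """Course grade for one report: weighted sum of its rubric scores (0-4)."""
    return sum(RUBRIC[item]["weight"] * score for item, score in scores.items())

def outcome_averages(all_scores):
    """Programmatic assessment: pool the same raw scores by program outcome."""
    pooled = defaultdict(list)
    for scores in all_scores:
        for item, score in scores.items():
            for outcome in RUBRIC[item]["outcomes"]:
                pooled[outcome].append(score)
    return {outcome: sum(v) / len(v) for outcome, v in pooled.items()}

# Example: two team reports, each scored exactly once against the rubric.
reports = [
    {"process_flow_diagram": 3.5, "economic_analysis": 3.0,
     "safety_and_environment": 4.0, "written_communication": 3.0},
    {"process_flow_diagram": 2.5, "economic_analysis": 3.5,
     "safety_and_environment": 3.0, "written_communication": 3.5},
]

for i, scores in enumerate(reports, start=1):
    print(f"Report {i} grade (0-4 scale): {grade_report(scores):.2f}")
print("Outcome averages for assessment reporting:", outcome_averages(reports))

Because each rubric item is tagged with the outcome(s) it evidences, the outcome aggregation costs the instructor no second evaluation pass over the reports, which is the “no extra effort” claim in the paper’s title.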

Dahm, K. D. (2011, June). Collecting Programmatic Assessment Data with No “Extra” Effort: Consolidated Evaluation Rubrics for Chemical Plant Design. Paper presented at the 2011 ASEE Annual Conference & Exposition, Vancouver, BC. 10.18260/1-2--17618
