June 26, 2011
June 29, 2011
22.337.1 - 22.337.19
In order to gain accreditation, engineering programs must define goals and objectives, assess whether their graduates are meeting these objectives, and “close the loop” by using the assessment data to inform continuous improvement of the program. In ABET’s jargon, program “objectives” describe capabilities that graduates are expected to possess, e.g., “Graduates of the Chemical Engineering program at XXX University will be able to….” Thus, the true success of the program in meeting its objectives is reflected in the first few years of graduates’ careers. Practically speaking, a program cannot be expected to assess directly the performance of graduates with respect to these objectives, at least not in a comprehensive way. Consequently, programs are expected to define and assess “outcomes” which fit within the undergraduate curriculum and which ensure, to the best degree possible, that graduates will meet the program objectives.

A variety of assessment instruments are in common use, and the merits and shortcomings of each have been discussed in the open literature. For example, surveys and exit interviews are commonly used, but they are subjective, rely on self-assessment, and likely oversimplify the questions under examination. This paper focuses on tools for direct measurement of student performance through objective evaluation of work product. Numerous authors have outlined the assessment strategy of constructing rubrics for measuring student achievement of learning outcomes and applying them to portfolios of student work. Other authors have outlined the use of rubrics for evaluation and grading of individual assignments and projects. This paper describes the use of a consolidated rubric for evaluating final reports in the capstone Chemical Plant Design course.
Instead of grading each assignment and then having it evaluated a second time as part of a portfolio, the instructor evaluates the report once using the rubric, and the same raw data are used for both grading and programmatic assessment.
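To make the dual-use idea concrete, the following is a minimal sketch (not from the paper) of how a single rubric-scoring pass might feed both an individual grade and aggregated outcome data. The rubric items, weights, outcome names, and scoring scale here are all invented for illustration; the paper does not specify its rubric's structure.

```python
# Hypothetical consolidated rubric: each item is scored once on a
# 0-4 scale and tagged with the program outcome it maps to.
# Item names, outcomes, and weights below are assumptions.
RUBRIC = [
    {"item": "process design",    "outcome": "design",        "weight": 0.4},
    {"item": "economic analysis", "outcome": "economics",     "weight": 0.3},
    {"item": "written report",    "outcome": "communication", "weight": 0.3},
]

def score_report(scores):
    """scores maps each rubric item to a 0-4 level.

    Returns (grade, outcome_data): the weighted grade on a 0-100
    scale, and the per-outcome raw levels for programmatic assessment.
    """
    grade = sum(r["weight"] * scores[r["item"]] / 4 * 100 for r in RUBRIC)
    outcome_data = {r["outcome"]: scores[r["item"]] for r in RUBRIC}
    return grade, outcome_data

# One evaluation yields both pieces of data: no second scoring pass.
grade, outcomes = score_report(
    {"process design": 3, "economic analysis": 4, "written report": 2}
)
```

The point of the sketch is only that the instructor records each rubric level once; grading and outcome aggregation are then two views of the same raw data.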
Dahm, K. D. (2011, June), Collecting Programmatic Assessment Data with No “Extra” Effort: Consolidated Evaluation Rubrics for Chemical Plant Design Paper presented at 2011 ASEE Annual Conference & Exposition, Vancouver, BC. 10.18260/1-2--17618
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2011 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015