
Rubric Development and Inter-Rater Reliability Issues in Assessing Learning Outcomes



2002 Annual Conference


Montreal, Canada

Publication Date

June 16, 2002

Start Date

June 16, 2002

End Date

June 19, 2002



Conference Session

Assessment in Large and Small Programs


Page Numbers

7.991.1 - 7.991.8




Paper Authors

James Newell

Heidi Newell

Kevin Dahm


NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract


Session 2613

Rubric Development and Inter-Rater Reliability Issues in Assessing Learning Outcomes

1James A. Newell, 1Kevin D. Dahm, and 2Heidi L. Newell
1 Department of Chemical Engineering / 2 College of Engineering
Rowan University, Glassboro, NJ 08028

Abstract

This paper describes the development of rubrics that help evaluate student performance and relate that performance directly to the educational objectives of the program. Issues in accounting for different constituencies, selecting items for evaluation, and minimizing time required for data analysis are discussed. Aspects of testing the rubrics for consistency between different faculty raters are presented, as well as a specific example of how inconsistencies were addressed. Finally, a consideration of the difference between course and programmatic assessment and the applicability of rubric development to each type is discussed.

Introduction

With the increased emphasis placed by ABET (1) on assessing learning outcomes, many faculty struggle to develop meaningful assessment instruments. In developing these instruments, the faculty members in the Chemical Engineering Department at Rowan University wanted to ensure that each instrument addressed the three fundamental program tasks as specified by Diamond (2):

· The basic competencies for all students must be stated in terms that are measurable and demonstrable
· A comprehensive plan must be developed to ensure that basic competencies are learned and reinforced throughout the time the students are enrolled in the institution
· Each discipline must specify learning outcomes congruent with the required competencies

Like many institutions (3), the Rowan University Chemical Engineering Department chose to use items that address multiple constituencies including alumni, industry, and the students themselves. Assessment data from these groups were obtained through alumni surveys, student peer-reviews, and employer surveys. These instruments were fairly straightforward to design and could be mapped directly to ABET A-K as well as the AIChE requirements and other department-specific goals. The difficulty arose when the discussion turned to student portfolios. As Rogers (4) observes, there is no one correct way to design a portfolio process. Essentially everyone agreed that a portfolio should contain representative samples of student work gathered primarily from courses taken in the junior and senior years. The ABET educational objectives are summative rather than formative in nature, so the faculty decided to focus on work generated near the end of the student's undergraduate career. A variety of assignments would be required to ensure that all of the diverse criteria required by
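Inter-rater reliability of the kind discussed here is commonly quantified with an agreement statistic such as Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. The paper does not specify a particular statistic, so the following is only an illustrative sketch: a minimal kappa computation for two hypothetical faculty raters scoring the same ten portfolio items on a 1-4 rubric scale (the scores shown are invented for illustration).

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same set of items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items given identical scores.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: from each rater's marginal score frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[s] * counts_b[s] for s in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical rubric scores (1-4 scale) from two faculty raters.
a = [3, 4, 2, 3, 4, 1, 2, 3, 4, 3]
b = [3, 4, 2, 2, 4, 1, 2, 3, 3, 3]
print(round(cohens_kappa(a, b), 3))  # 8/10 raw agreement, kappa ≈ 0.718
```

A kappa near 1 indicates the rubric elicits consistent scoring between raters; values well below raw agreement signal that much of the apparent consistency is attributable to chance.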

Proceedings of the 2002 American Society for Engineering Education Annual Conference & Exposition Copyright © 2002, American Society for Engineering Education


Newell, J., & Newell, H., & Dahm, K. (2002, June), Rubric Development And Inter Rater Reliability Issues In Assessing Learning Outcomes Paper presented at 2002 Annual Conference, Montreal, Canada. 10.18260/1-2--10943

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2002 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015