June 14, 2009
June 17, 2009
Two Year College Division
14.412.1 - 14.412.6
Design and Implementation of Scoring Rubrics for Technical Courses in Two-Year Colleges
The purpose of assessment is to measure student performance. When evaluating a project, instructors need to ensure that assignments are scored as objectively as possible. A rubric sets clear expectations and defines the quality of work expected for a given project. Descriptive scoring schemes have become a common method for evaluating course content; the descriptive scale supports the evaluation of the criteria set for each project. The focus of this paper is the design and implementation of scoring rubric models for technical courses in two-year colleges. The major points of this paper include identifying common definitions of assessment, identifying specific observable attributes for evaluating student performance, defining and brainstorming characteristics that describe each attribute, and designing and implementing scoring rubrics for a technical course. The following steps are involved in developing scoring rubrics: defining and listing learning objectives for technical courses, identifying the specific attributes that students should demonstrate in their performance, identifying each attribute and its characteristics, and distinguishing excellent from poor quality work using narrative descriptive criteria. Holistic rubrics and analytical rubrics are both used to measure students' understanding of course content. Holistic rubrics state the highest and lowest levels of performance by combining the descriptors for all attributes, while analytical rubrics state the highest and lowest levels of performance using the descriptions for each attribute separately. The use of rubrics allows the instructor to provide quality feedback to the student, along with providing evaluation and reflection opportunities for the instructor as well. The use of rubrics in a technical program provides accountability and evaluation that benefit both students and the college program.
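The holistic/analytical distinction described above can be illustrated with a minimal sketch. This is not from the paper itself: the attribute names, the 4-point scale, and the choice to compute a holistic score as a rounded average are all hypothetical, chosen only to show how the two rubric types aggregate the same per-attribute judgments differently.

```python
# Illustrative sketch (assumed attributes and scale, not the paper's rubric):
# an analytical rubric reports each attribute's score separately, while a
# holistic rubric collapses the same judgments into one overall score.

# Hypothetical attributes for a technical project, each with descriptors
# for the highest (4) and lowest (1) performance levels.
ANALYTIC_RUBRIC = {
    "technical_accuracy": {4: "Methods fully correct, no errors",
                           1: "Major conceptual errors throughout"},
    "documentation":      {4: "Complete, clear, and well organized",
                           1: "Missing or unreadable"},
    "safety_procedures":  {4: "All required procedures followed",
                           1: "Required procedures ignored"},
}

def analytic_score(ratings):
    """Analytical rubric: sum the per-attribute ratings."""
    return sum(ratings[attr] for attr in ANALYTIC_RUBRIC)

def holistic_score(ratings):
    """Holistic rubric: one overall judgment (here, the rounded average)."""
    return round(sum(ratings.values()) / len(ratings))

ratings = {"technical_accuracy": 4, "documentation": 3, "safety_procedures": 4}
print(analytic_score(ratings))  # 11 of 12 possible points
print(holistic_score(ratings))  # 4 on the 1-4 scale
```

The design point is that both functions consume the same per-attribute ratings; only the reporting granularity differs, which is why analytical rubrics give more diagnostic feedback while holistic rubrics are faster to apply.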
Designing a valuable assessment to measure students' understanding of course content is both challenging and complicated, especially if it is designed prior to establishing clear course objectives, goals, and expectations. The purpose of assessment is to reinforce the accountability of both the student and the instructor. It gives the student an opportunity to demonstrate their full potential, capacity, and ability to internalize, process, and apply the presented course content. The instructor, in turn, is given an opportunity to process the results and determine student strengths and weaknesses within the course. Assessment is thus a beneficial instrument for generating purposeful feedback that helps students recognize their misconceptions and correct these deficiencies within the course content. At the same time, instructors can reevaluate their educational techniques and improve their effectiveness to better steer students away from common misconceptions. Technical college instructors can take this evaluation piece a step further by incorporating a rubric.
The purpose of the rubric is to outline the assessment's expectations with the required course-related details. Typically, a rubric assigns a score range based on student performance of
Heidari, F. (2009, June). Design and Implementation of Scoring Rubrics for Technical Courses in Two-Year Colleges. Paper presented at the 2009 ASEE Annual Conference & Exposition, Austin, Texas. https://peer.asee.org/4900
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2009 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015