June 24, 2017
June 28, 2017
Educational Research and Methods
Most engineering programs develop learning outcomes, performance indicators, and rubrics primarily to satisfy accreditation requirements. As a result, programs and courses typically adopt a small set of learning outcomes with generic, broadly applicable performance indicators and rubrics. Yet over the course of curriculum delivery, engineering specializations teach students several hundred specific learning activities. Generic rubrics cannot adequately characterize these specific activities and their skill levels: they cannot be applied accurately to the assessment and scoring of specific student work, and they create problems for inter- and intra-rater reliability. Because of their broad meaning and application, generic performance indicators also hinder scientific constructive alignment, so performance failure analysis demands additional resources and an exhaustive examination of objective evidence. The result is procedures that do not yield timely remedial actions for continuous quality improvement. Nor do generic performance indicators adequately support holistic curriculum delivery covering all three of Bloom's learning domains and their learning levels. Manual processes, methodology that does not lend itself to automation and streamlining, and underuse of digital technology are the main factors that drive programs toward generic, broad performance indicators and rubrics.
In this research, we present the essential principles of an authentic outcome-based education (OBE) model for developing learning outcomes, performance indicators, and rubrics, with a focus on measuring specific skills across Bloom's three learning domains and their learning levels for engineering specializations. We analyze the culminating ABET Engineering Accreditation Commission student outcomes with reference to Bloom's three learning domains and their learning levels, and present a hypothetical model for this analysis. The correlation of ABET student outcomes, course learning outcomes, and performance indicators is clearly outlined. We highlight the necessity of performance indicators, particularly for measuring course learning outcomes and for developing assessments and teaching and learning activities, and discuss the importance of scientific constructive alignment among learning outcomes, performance indicators, assessments, and teaching and learning strategies. A novel hybrid rubric for accurate assessment and scoring of student performance is also presented, along with actual examples of applying this theory to program-, course-, and student-level performance evaluations using state-of-the-art web-based digital technology. In summary, the benefits of specific performance indicators over generic ones are explained in detail with respect to supporting authentic OBE principles, scientific constructive alignment, accurate measurement of student performance in specific engineering learning activities, performance failure analysis, and continuous quality improvement.
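The correlation the abstract describes, in which specific performance indicators measure course learning outcomes, which in turn roll up to ABET student outcomes, can be illustrated with a minimal scoring sketch. All indicator names, rubric scores, and the equal-weight averaging below are hypothetical assumptions for illustration only; they are not data or formulas from the paper.

```python
# Hypothetical rollup of rubric scores on specific performance indicators
# (PIs) to course learning outcomes (CLOs) and an ABET student outcome (SO).
# Names, scores (0-4 rubric scale), and equal weighting are all assumptions.

pi_scores = {
    "PI-1a": 3.0,  # e.g., a specific cognitive-domain activity
    "PI-1b": 4.0,  # e.g., a specific psychomotor-domain activity
    "PI-2a": 2.0,  # e.g., a specific affective-domain activity
}

# Constructive alignment: each CLO is measured by one or more specific PIs.
clo_map = {
    "CLO-1": ["PI-1a", "PI-1b"],
    "CLO-2": ["PI-2a"],
}

# The student outcome aggregates CLOs; equal weights assumed here.
so_map = {"SO-a": ["CLO-1", "CLO-2"]}

def mean(xs):
    return sum(xs) / len(xs)

# Average PI scores up to each CLO, then CLO attainments up to the SO.
clo_attainment = {clo: mean([pi_scores[pi] for pi in pis])
                  for clo, pis in clo_map.items()}
so_attainment = {so: mean([clo_attainment[clo] for clo in clos])
                 for so, clos in so_map.items()}

print(clo_attainment)  # {'CLO-1': 3.5, 'CLO-2': 2.0}
print(so_attainment)   # {'SO-a': 2.75}
```

Because each score is tied to a specific, named activity, a low SO attainment can be traced back through the CLO and PI levels to the exact activity that failed, which is the kind of performance failure analysis the paper argues generic indicators make costly.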
Hussain, W., & Spady, W. G. (2017, June). Specific, generic performance indicators and their rubrics for the comprehensive measurement of ABET student outcomes. Paper presented at the 2017 ASEE Annual Conference & Exposition, Columbus, Ohio. https://doi.org/10.18260/1-2--28837
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2017 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.