
Specific, Generic Performance Indicators and Their Rubrics for the Comprehensive Measurement of ABET Student Outcomes



2017 ASEE Annual Conference & Exposition


Columbus, Ohio

Publication Date: June 24, 2017

Start Date: June 24, 2017

End Date: June 28, 2017

Conference Session: Understanding the Discipline of Engineering

Tagged Division: Educational Research and Methods


Paper Authors


Wajid Hussain is an electrical/computer engineer with a Master of Science degree, more than 15 years of engineering experience, and mass-production expertise spanning the life cycle of billion-dollar microprocessor manufacturing.

Over the years, Wajid has managed several projects that streamlined operations through state-of-the-art technology and digital systems, giving him significant experience working with ISO-standard quality systems. He received specialized quality leadership training at LSI Corporation and an award at the LSI Corporation Worldwide Operations Review 1999 for his significant contributions to its quality improvement systems.

He is a specialist in ABET accreditation procedures and was appointed by Dr. Mubarak Mutairi, Dean of Engineering at the KFUPM Hafr Al Batin campus, to lead the intensive effort of preparing the EEET program for the ABET evaluation team's site visit in 2013. The EEET display materials, whose preparation Wajid managed to completion, received excellent comments from ABET team chair Dr. Subal Sarkar.

Wajid is also certified as an expert in outcomes assessment and EvalTools® 6 by MAKTEAM Inc., USA.

He is a member of ASEE, the SAP Community, and IEEE Qatar; a Senior Member of IEEE; and a member of REED MEP Professionals International & Middle East, with experience of ISO 9001 quality systems. He has taught courses on electric, electronic, and digital circuits; microprocessors; and instrumentation and measurements.

Wajid Hussain is now Director of the Office of Quality & Accreditation at the Faculty of Engineering, Islamic University, Madinah Munawarrah campus, and a member of the Quality and Accreditation Committee for the Faculty of Engineering.

He has been a speaker on outcomes assessment and automation at ASEE, FIE, ICA, OBE ICON, and MTN conferences, and has conducted several workshops at the IU campus on outcomes assessment best practices, OBE, EvalTools® 6 for faculty, e-learning with EvalTools® 6 for students, and the ABET accreditation process.

He is a digital integrated quality management systems expert for automated, academic student-outcomes-based assessment methodology, specializing in EvalTools® 6 by MAKTEAM Inc.



William G. Spady, International Network for OBE


Dr. Spady has been a leading pioneer in outcome-based thinking and implementation for 45 years. A Ph.D. graduate of the University of Chicago in 1967, he was introduced to the seminal work of Benjamin Bloom in 1968 and transformed its fundamentals into a comprehensive, paradigm-shifting system of educational transformation that he has shared through his 8 books, dozens of published papers, and countless presentations and workshops to educational institutions on 4 continents. He regards OBE as a powerful, future-focused, ever-evolving approach to learner empowerment, and regrets that it has been so badly misunderstood and misrepresented across the world.




Most engineering programs employ methods for developing learning outcomes, performance indicators, and rubrics that focus primarily on fulfilling accreditation requirements. As a result, programs usually develop a relatively small set of learning outcomes with generic, broadly applicable performance indicators and rubrics at the program or course level. Yet engineering specializations typically teach students several hundred specific activities over the course of curriculum delivery. Generic rubrics have obvious difficulty characterizing these specific student learning activities and their skill levels: they cannot be applied accurately to the assessment and scoring of specific learning activities, and they create problems for inter- and intra-rater reliability. Because of their broad meaning and application, generic performance indicators do not easily support scientific constructive alignment, so performance failure analysis usually requires additional resources and a thorough examination of objective evidence. This leads to procedures that do not yield timely remedial actions for continuous quality improvement. Generic performance indicators also do not adequately support holistic curriculum delivery covering all three of Bloom's learning domains and their learning levels. Manual processes, methodology that does not support automation or streamlining, and lack of digital technology are the main factors that compel programs to develop generic, broad performance indicators and rubrics.

In this research, we present the essential principles of an authentic outcome-based educational model for developing learning outcomes, performance indicators, and rubrics, with a focus on measuring specific skills across Bloom's three learning domains and their learning levels for engineering specializations. Culminating ABET Engineering Accreditation Commission student outcomes are analyzed with reference to Bloom's three learning domains and their learning levels, and a hypothetical model is presented for this analysis. The correlation of ABET student outcomes, course learning outcomes, and performance indicators is clearly outlined. The necessity of performance indicators is highlighted, especially for measuring course learning outcomes and developing assessments and teaching and learning activities. The importance of scientific constructive alignment of learning outcomes, performance indicators, assessments, and teaching and learning strategies is discussed. A novel hybrid rubric for accurate assessment and scoring of student performances is also presented. Actual examples of applying this theory to program-, course-, and student-level performance evaluations using state-of-the-art web-based digital technology are shown. In summary, the benefits of specific performance indicators over generic ones are explained in detail with respect to supporting authentic OBE principles, scientific constructive alignment, accurate measurement of student performance in specific engineering learning activities, performance failure analysis, and continuous quality improvement.
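The correlation the abstract describes — specific performance indicators measured per activity, rolled up into course learning outcomes and then into ABET student outcomes — can be sketched as a simple weighted aggregation. The following Python sketch is illustrative only: the PI/CLO names, scores, and weights are hypothetical, and the paper's actual methodology (implemented in EvalTools® 6) may aggregate differently.

```python
# Hypothetical sketch of rolling up scores on specific performance
# indicators (PIs) into course learning outcomes (CLOs) and then into
# an ABET student outcome (SO). All names, scores, and weights below
# are invented for illustration; they are not taken from the paper.

def weighted_average(scores, weights):
    """Weighted mean of a list of scores; weights need not sum to 1."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# Each CLO is measured by several specific PIs (scores on a 0-100 scale),
# stored as (PI scores, PI weights).
clo_data = {
    "CLO1": ([85.0, 72.0, 90.0], [1.0, 1.0, 2.0]),
    "CLO2": ([60.0, 78.0],       [1.0, 1.0]),
}

# Roll each CLO's specific PI scores up into one CLO score.
clo_scores = {clo: weighted_average(s, w) for clo, (s, w) in clo_data.items()}

# Roll the CLOs mapped to a student outcome up into one SO score
# (equal CLO weights assumed here).
so_score = weighted_average(list(clo_scores.values()), [1.0] * len(clo_scores))
```

Because each PI is tied to one specific learning activity, a low `so_score` can be traced back through `clo_scores` to the individual PI that failed — the kind of performance failure analysis the abstract argues generic indicators cannot support.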

Hussain, W., & Spady, W. G. (2017, June), Specific, Generic Performance Indicators and Their Rubrics for the Comprehensive Measurement of ABET Student Outcomes. Paper presented at 2017 ASEE Annual Conference & Exposition, Columbus, Ohio. doi:10.18260/1-2--28837

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2017 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015