
Implementing EC2000 – Perspectives from Both Sides of the Assessment Trench



2007 Annual Conference & Exposition


Honolulu, Hawaii

Publication Date

June 24, 2007

Start Date

June 24, 2007

End Date

June 27, 2007



Conference Session

Meeting ABET Requirements

Tagged Division

Mechanical Engineering

Page Numbers

12.842.1 - 12.842.8




Paper Authors


Michael Ward California State University-Chico


Dr. Michael Ward is Associate Dean of the College of Engineering, Computer Science, and Construction Management at California State University, Chico. Dr. Ward has primary responsibility for coordinating accreditation and assessment activities, among other duties. He has been a Mechanical Engineering faculty member for 25 years, served as Mechanical Engineering Department Chair for 10 years, and has served as Associate Dean since 2001. He received his Ph.D. from Stanford University and worked for Lockheed Missiles and Space Company prior to becoming an engineering educator.



NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Implementing EC2000 – Perspectives from Both Sides of the Assessment Trench

Abstract

As implementation of Engineering Criteria 2000 (EC2000) has matured, so have the expectations for program assessment held by many ABET-EAC program evaluators. Most engineering programs are preparing for, or have just completed, a second visit under the new criteria, and many find that assessment practices considered acceptable for the first EC2000 visit no longer are. Surprises at this point are clearly not appreciated. ABET's current concern with the EC2000 accreditation process is to make program evaluations consistent, while programs are concerned with developing assessment programs that meet ABET's expectations yet remain sustainable. For many programs the process remains episodic, although the intention of EC2000 is to make program assessment systematic. As both an ABET evaluator and the individual with administrative responsibility for accrediting five engineering programs at California State University, Chico, the author offers suggestions to help develop a clear framework for assessment activities and to make the process sustainable. Examples of annual timelines for data collection and evaluation, and overall reporting strategies based on the author's experience, are offered.


Assessment of student learning outcomes has become a fundamental part of the framework for American higher education in the 21st century. Regional accreditation agencies, as well as the Accreditation Board for Engineering and Technology (ABET), highlight the importance of assessing student learning outcomes. Problems encountered with early EC2000 assessment programs were noted in a study initiated by the American Society of Mechanical Engineers (ASME) titled Initial Assessment of the Impact of ABET/EC2000 Implementation Using Mechanical Engineering Programs as the Pilot Study Group [1]. That study lauded the extensive initial involvement of faculty in defining educational objectives and the participation of program Advisory Boards, among other strengths. It also noted the shortcomings of certain assessment techniques and the failure of initial employer survey methods, and it expressed concern regarding the strain that EC2000 appeared to place on program resources. In 2002 ABET commissioned a comprehensive study, published in 2006, titled Engineering Change: A Study of the Impact of EC2000 [2]. Through careful evaluation of surveys of educators, employers, and engineering graduates, the study documents the overall positive impact that EC2000 implementation has had on engineering education. While the impact on engineering program graduates is clearly positive, that finding does not change the perspective of many faculty that ongoing assessment programs represent a great deal of work that at times has no apparent connection with program improvement. In an early ABET newsletter, a column by Gloria Rogers titled "How Are We Doing?" summarized her expert opinion of our overall success in implementing EC2000 [3].

In her evaluation, engineering programs overall earned a "D" grade for program assessment for a number of reasons, including a slow migration from indirect to direct methods: many programs continue to rely heavily on student surveys instead of direct measures of student learning. Further, Rogers pointed out that many EC2000 implementers jump from a pre-defined set of outcomes, namely ABET (a)-(k), straight to collecting mounds of data without a faculty consensus on what body of evidence constitutes achievement of a given outcome within a given program.

Ward, M. (2007, June). Implementing EC2000 – Perspectives from Both Sides of the Assessment Trench. Paper presented at 2007 Annual Conference & Exposition, Honolulu, Hawaii. doi:10.18260/1-2--2725

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2007 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015