Conference Location: Honolulu, Hawaii
Publication Date: June 24, 2007
Conference Start Date: June 24, 2007
Conference End Date: June 27, 2007
ISSN: 2153-5965
Division: Electrical and Computer
Page Count: 16
Pages: 12.274.1 - 12.274.16
DOI: 10.18260/1-2--2946
Permanent URL: https://peer.asee.org/2946
ROBERT W. SADOWSKI is an Associate Professor and the Electrical Engineering Program Director in the Department of Electrical Engineering and Computer Science at the US Military Academy at West Point. He received the Ph.D. in Electrical Engineering from Stanford University in 1995 as a Fannie and John Hertz Foundation Fellow and is a Senior Member of the Institute of Electrical and Electronics Engineers.
LISA A. SHAY is an Assistant Professor in the Department of Electrical Engineering and Computer Science at the US Military Academy at West Point. She received the M.Sc. in Engineering from Cambridge University as a Marshall Scholar in 1996 and the Ph.D. in Electrical Engineering from Rensselaer Polytechnic Institute in 2002. She is a Member of ASEE and a Senior Member of the Institute of Electrical and Electronics Engineers.
CHRISTOPHER M. KORPELA is an Instructor in the Department of Electrical Engineering and Computer Science at the US Military Academy at West Point. He received his M.S. in Electrical Engineering from the University of Colorado in 2006 and is a member of the Institute of Electrical and Electronics Engineers.
ERIK J. FRETHEIM is an Assistant Professor in the Department of Electrical Engineering and Computer Science at the US Military Academy at West Point and the CEO of Baseline Evaluations Corporation, an instruction evaluation company. He received his Ph.D. in Electrical Engineering from the Air Force Institute of Technology, Wright-Patterson AFB, in 1991, an M.S.E.E. from the Air Force Institute of Technology in 1988, and an MBA from Long Island University in 1994. He is a member of the Institute of Electrical and Electronics Engineers.
Assessing the EE Program Outcome Assessment Process
Abstract
Program outcome assessment is an integral part of systematic curriculum review and improvement. Accrediting commissions expect each student to achieve program outcomes by the time of graduation. Programs undergoing accreditation must have an assessment process that demonstrates program outcome achievement. Documenting and assessing just how graduates are meeting program outcomes can become a tedious and data-intensive process. We report on our “assessment” of our own assessment process, which resulted in more streamlined procedures through better-targeted performance indicators. Our methodology included the development of a learn, practice, and demonstrate model for each outcome that focuses performance indicators at the appropriate point in a student’s development. We target actual outcome achievement during the “demonstrate” phase, using rubrics to detail the level of mastery on a modified Likert scale.
We originally used seventy-eight embedded performance indicators spread throughout the curriculum. We reduced this set to thirty indicators using a mixture of internal and external measures, such as individual classroom events and Fundamentals of Engineering exam topical-area results. We also established guidelines targeting a single outcome measurement per indicator. For example, in our capstone senior design course, virtually every assignment was being reviewed by one of our outcome monitors. By targeting performance indicators at specific sub-events and distinguishing those that had to be assessed during the course from indicators assessed by advisors or senior faculty, we were able to reduce the embedded performance indicators by a factor of three. We applied similar techniques to reduce individual course directors’ workload. We have found that by streamlining the outcome process and using a rubric approach applied across multiple outcomes, we can greatly reduce the number of performance indicators yet preserve our ability to accurately assess our program. The reduced workload of assessing the program has enabled us to place more effort into improving the program.
I. Introduction
Documenting, assessing, and evaluating program outcome achievement can be a tedious and data-intensive process. (Note that we use the term “assess” to mean the identification and collection of data and “evaluate” to mean the interpretation of data; these definitions are consistent with those used by ABET1.) At the United States Military Academy at West Point, NY, we recently reviewed our program assessment process to determine a more efficient way of assessing and evaluating outcome achievement without sacrificing the quality of the evaluation. Our program created its outcomes and outcome assessment process in 2000, just as the ABET EC2000 criteria were published, and we were one of the early programs to be accredited under the new standards. After several years of assessing under the new system, we were concerned about the time and effort our faculty spent on the outcome assessment and evaluation process. We convened a panel of senior faculty to review our assessment process and were able to reduce overhead and increase efficiency in two areas: outcomes and embedded indicators. We revised our nine program outcomes to map more directly to ABET Criterion 3 Outcomes a-k while still meeting Criterion 5 and supporting our program objectives. By carefully examining how we chose
Sadowski, R., Shay, L., Korpela, C., & Fretheim, E. (2007, June). Assessing the EE Program Outcome Assessment Process. Paper presented at the 2007 ASEE Annual Conference & Exposition, Honolulu, Hawaii. doi: 10.18260/1-2--2946