
A Practical and Comprehensive Approach of Assessing ABET Outcome Achievement in Computer Science and Computer Engineering



2012 ASEE Annual Conference & Exposition


San Antonio, Texas

Publication Date

June 10, 2012

Start Date

June 10, 2012

End Date

June 13, 2012



Conference Session

ABET Accreditation, Assessment, and Program Improvement in ECE

Tagged Division

Electrical and Computer

Page Numbers

25.90.1 - 25.90.19




Paper Authors


David Wilczynski University of Southern California


David Wilczynski has a long history at USC. He was the first Ph.D. graduate from the USC Information Sciences Institute in 1975, where some of the initial work on Arpanet was done. His research specialty at the time was knowledge representation. In 1984, he left USC for almost 20 years to be an entrepreneur. Most of his work was in manufacturing, both in Detroit and Japan. During that time, he worked on programming real-time systems using an agent methodology, which he now teaches in his CSCI 201 class. He returned to USC in 2002 to teach full time. Mostly, he worries about how to make undergraduate engineering students more professional. Once a tennis player, he is now trying to become a golfer. Bridge, cooking, and his family take the rest of his time.



Gisele Ragusa University of Southern California


Gisele Ragusa is an Associate Professor at the University of Southern California's Viterbi School of Engineering and Rossier School of Education. She has expertise in engineering education, design pedagogy, faculty development, K-12 STEM education and assessment, measurement, and advanced research design.




A Practical and Comprehensive Approach of Assessing ABET Outcome Achievement in Computer Science and Computer Engineering

Abstract

Every serious organization, and academic institutions in particular, has a vision or mission statement. Schools of engineering are not exempt from this charge. Though missions and visions are easy to write, their achievement is often difficult to assess. The Accreditation Board for Engineering and Technology (ABET) has crystallized this issue among university undergraduate engineering programs attempting to gain accreditation. ABET requires that an undergraduate program state its educational outcomes and demonstrate how it measures outcome achievement. Many programs, including ours in computer science, have in the past relied almost exclusively on course-specific student perception surveys and other indirect methods of student assessment. These "perceptions" have been largely discredited as biased and subjective. In recent years, ABET has challenged the academic community to utilize assessment methodologies based on direct, measurable data. Our response, the subject of this ASEE paper, proposes a methodology that requires professors to: (1) state their individual course outcomes and map them to the ABET program outcomes, and (2) produce for each exam or assignment three important components: the source document, a mapping of the exam or assignment to the class outcomes, and the results. Our two-level outcome and assessment metrics mapping supports precisely the kind of outcome-achievement analysis that ABET desires.

Accordingly, and perhaps because this methodology is easy to explain and interpret, we have achieved 100% outcome-mapping compliance with our undergraduate teaching faculty.

Methodologically, to achieve our goals of direct assessment of student learning in accordance with ABET criteria and to articulate our program mission in alignment with ABET's charge, we have employed both descriptive statistics and a normalization technique in our analyses of student assessment data. An information-processing approach to student achievement has served as the theoretical frame within which we developed our assessment metrics. This theoretical frame provides the most robust means by which to engage in direct assessment of student achievement and to map achievement to program outcomes. Specifically, once the mapping among course objectives, ABET outcomes, and student assessment metrics was completed, the resulting student scores for each outcome were summed and then averaged (our normalization technique). These averaged outcome-achievement scores were then mapped back course by course. Accordingly, we created a complete mapping of ABET outcome achievement by semester that is now being used for curriculum reform and program improvement in our computer science and computer engineering programs.

This comprehensive student achievement mapping process is mutually understandable by diverse faculty and somewhat labor neutral for academic departments once the initial structures for mapping, data collection, and analysis are established and support is provided. It can be adopted and adapted to fit diverse engineering and science academic departments as a means of analyzing diverse data sets and mapping student achievement to program outcomes.
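The two-level mapping and normalization described in the abstract (assessment items mapped to course outcomes, course outcomes mapped to ABET program outcomes, then scores summed and averaged per outcome) can be sketched in code. The paper itself gives no implementation; this is a minimal illustrative sketch, and all outcome labels, item names, and scores below are hypothetical:

```python
# Hypothetical sketch of the paper's two-level mapping and normalization:
# assessment items -> course outcomes -> ABET program outcomes, with the
# contributing scores summed and then averaged per ABET outcome.
# All names and values are illustrative, not from the paper.
from collections import defaultdict

# Level 1: map each exam/assignment item to the course outcomes it assesses.
item_to_course_outcomes = {
    "midterm_q1": ["CO1"],
    "midterm_q2": ["CO2"],
    "project": ["CO1", "CO3"],
}

# Level 2: map each course outcome to ABET program outcomes (a-k style labels).
course_to_abet = {
    "CO1": ["a"],
    "CO2": ["b"],
    "CO3": ["a", "c"],
}

# Normalized student scores (0-1) for each assessment item.
item_scores = {"midterm_q1": 0.85, "midterm_q2": 0.70, "project": 0.90}

def abet_achievement(item_scores, item_to_course_outcomes, course_to_abet):
    """Collect every score contributing to each ABET outcome, then average."""
    contributions = defaultdict(list)
    for item, score in item_scores.items():
        for course_outcome in item_to_course_outcomes[item]:
            for abet_outcome in course_to_abet[course_outcome]:
                contributions[abet_outcome].append(score)
    # Sum and average per outcome (the "normalization" step).
    return {outcome: sum(s) / len(s) for outcome, s in contributions.items()}

print(abet_achievement(item_scores, item_to_course_outcomes, course_to_abet))
```

Repeating this per course and per semester, and mapping the averaged scores back course by course, would reproduce the semester-level achievement table the abstract describes.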

Wilczynski, D., & Ragusa, G. (2012, June), A Practical and Comprehensive Approach of Assessing ABET Outcome Achievement in Computer Science and Computer Engineering Paper presented at 2012 ASEE Annual Conference & Exposition, San Antonio, Texas. 10.18260/1-2--20850

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2012 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015