
Direct Assessment Measures


Conference: 2007 Annual Conference & Exposition

Location: Honolulu, Hawaii

Publication Date: June 24, 2007

Start Date: June 24, 2007

End Date: June 27, 2007

ISSN: 2153-5965

Conference Session: ChE: Assessment

Tagged Division: Chemical Engineering

Page Count: 11

Page Numbers: 12.548.1 - 12.548.11

DOI: 10.18260/1-2--1537

Permanent URL: https://peer.asee.org/1537


Paper Authors


Joseph Shaeiwitz, West Virginia University


Joseph A. Shaeiwitz received his B.S. degree from the University of Delaware and his M.S. and Ph.D. degrees from Carnegie Mellon University. His professional interests are in design, design education, and outcomes assessment. Joe is an associate editor of the Journal of Engineering Education, and he is a co-author of the text Analysis, Synthesis, and Design of Chemical Processes (2nd ed.), published by Prentice Hall in 2003.



Daina Briedis, Michigan State University


Daina Briedis is a faculty member in the Department of Chemical Engineering and Materials Science at Michigan State University. Dr. Briedis has conducted research in bioadhesion; she is currently studying the development of integrated approaches to using computational tools to support technical problem solving throughout the curriculum. She is active nationally and internationally in engineering accreditation, is an ABET IDEAL Scholar, and is a member of the ABET Board. She leads the assessment and evaluation efforts in her program.



Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Direct Assessment Measures

Introduction

Engineering programs have recently completed, or are in the process of preparing for, their second accreditation visit under the assessment-based criteria. Based on our combined experience of over 30 visits as both program evaluators and team chairs, it appears that many programs are struggling to identify valid measures for their program outcomes and thus to come into full compliance with the requirements of Criterion 3 of the Engineering Criteria. This is substantiated by the relatively large number of citations for shortcomings related to some aspect of this criterion.1 One cause is that many programs rely very heavily on surveys and similar indirect, or “soft,” measures of these outcomes. We believe that there is too much reliance on these indirect assessment measures and that programs should endeavor to make direct assessment a cornerstone of their program improvement processes.

Indirect assessment measures include end-of-course surveys, graduation surveys, and alumni surveys, in which an evaluation is based on opinion or self-reporting. These measures are necessary but not sufficient, and some may be more appropriate for evaluating objectives than for demonstrating outcomes. Direct assessment by the faculty is necessary to provide an objective measure of students’ achievement of educational outcomes. All quality assessment plans should include direct assessment measures.

ABET is taking a harder line on the need for direct assessment as part of demonstrating that students have the required skills, knowledge, and attitudes encompassed by the eleven ABET-designated outcomes. ABET has received significant negative feedback from engineering programs about inconsistencies in program evaluator findings for compliance with Criterion 3. While it is true that, in the past, programs may have passed through accreditation visits with only indirect assessment (surveys, faculty opinions, ad hoc data) as evidence of student achievement of outcomes, it is likely that this will no longer be the case. Programs relying only on indirect measures for outcomes assessment will likely be cited for shortcomings in future accreditation visits.

This paper deals mainly with the assessment of program outcomes: the knowledge, skills, and attributes that students should demonstrate by the time of graduation. The focus is therefore on campus- or curriculum-based assessment. This paper does not address program educational objectives, which describe the career and professional accomplishments of program alumni. The former usually requires different evaluation and assessment tools than the latter, although some overlap does exist. We will also refer to course objectives, which are not to be confused with program educational objectives.

Given this context and the increasing importance of using direct assessment methods, this paper provides a review of direct assessment measures. These methods include, but are not limited to, instructor end-of-course assessments, targeted assignments (assigned problems, exam questions, projects), capstone examinations (including the FE Exam), student portfolios, and capstone experiences.

Shaeiwitz, J., & Briedis, D. (2007, June), Direct Assessment Measures. Paper presented at 2007 Annual Conference & Exposition, Honolulu, Hawaii. 10.18260/1-2--1537

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2007 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.