Practical, Efficient Strategies for Assessment of Engineering Projects and Engineering Programs

Conference

2010 Annual Conference & Exposition

Location

Louisville, Kentucky

Publication Date

June 20, 2010

Start Date

June 20, 2010

End Date

June 23, 2010

ISSN

2153-5965

Conference Session

Been There, Done That: Advice for New Faculty

Tagged Division

New Engineering Educators

Page Count

13

Page Numbers

15.966.1 - 15.966.13

DOI

10.18260/1-2--16438

Permanent URL

https://peer.asee.org/16438

Paper Authors

Kevin Dahm, Rowan University

Kevin Dahm is an Associate Professor of Chemical Engineering at Rowan University. He received his B.S. from Worcester Polytechnic Institute in 1992 and his Ph.D. from the Massachusetts Institute of Technology in 1998. He has published in the areas of engineering design, pedagogically sound uses for simulation and computing, assessment of student learning, and teaching engineering economy. He has received four ASEE awards: the 2002 PIC-III Award, the 2003 Joseph J. Martin Award, the 2004 Raymond W. Fahien Award, and the 2005 Corcoran Award.

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

PRACTICAL, EFFICIENT STRATEGIES FOR ASSESSMENT OF ENGINEERING PROJECTS AND ENGINEERING PROGRAMS

Abstract

The process of seeking and gaining accreditation for an engineering program changed substantially ten years ago, when the EC2000 criteria were implemented. (The moniker EC2000 is no longer in use; they are now simply the ABET criteria.) Programs must now define goals and objectives, provide evidence that graduates are meeting these objectives, and demonstrate evidence of continuous improvement. These accreditation criteria present programs with significant challenges. Departments must determine what data are needed and collect those data regularly. To be sustainable, assessment plans must make efficient use of faculty time. This paper will present strategies for collecting assessment data that serve multiple purposes beyond accreditation, using the Rowan University Junior/Senior Engineering Clinic as an example.

The Rowan University Junior/Senior Engineering Clinic is a multidisciplinary, project-based course required of engineering students in all disciplines. Students solve real engineering research and design problems, many of which are sponsored by local industry. Because each clinic project is unique, grading student work and maintaining approximately uniform expectations across all projects is a significant challenge. At the same time, the Clinic is the course within the Rowan Engineering curriculum that best reflects professional engineering practice. Consequently, the Junior/Senior Clinic provides an excellent forum for assessing whether students have indeed achieved the desired pedagogical outcomes of the curriculum. This paper will present the set of assessment rubrics currently used by the Rowan Chemical Engineering department. The data collected serve two purposes: grading individual student projects and program-level assessment.
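
To make the dual use of the rubric data concrete, the sketch below shows how a single set of rubric scores could drive both an individual project grade and a program-level outcome summary. The criteria, weights, 1-4 scale, and attainment threshold are illustrative assumptions for this sketch, not the actual Rowan rubric.

# Hypothetical sketch of dual-purpose rubric data: the same scores
# feed project grading and program-level outcome assessment.
# Criteria, weights, and threshold are invented for illustration.
from statistics import mean

# Rubric scores on a 1-4 scale, keyed by project, then by criterion.
rubric_scores = {
    "Project A": {"technical_quality": 4, "teamwork": 3, "communication": 3},
    "Project B": {"technical_quality": 2, "teamwork": 4, "communication": 3},
    "Project C": {"technical_quality": 3, "teamwork": 3, "communication": 4},
}

# Purpose 1: grade each project as a weighted average of its criteria.
weights = {"technical_quality": 0.5, "teamwork": 0.25, "communication": 0.25}

def project_grade(scores: dict[str, int]) -> float:
    return sum(weights[c] * s for c, s in scores.items())

for project, scores in rubric_scores.items():
    print(f"{project}: grade = {project_grade(scores):.2f} / 4")

# Purpose 2: program-level assessment -- for each criterion, report the
# mean score and the fraction of projects meeting a target threshold.
THRESHOLD = 3.0
for criterion in weights:
    all_scores = [s[criterion] for s in rubric_scores.values()]
    met = sum(score >= THRESHOLD for score in all_scores) / len(all_scores)
    print(f"{criterion}: mean = {mean(all_scores):.2f}, "
          f"{met:.0%} of projects at or above {THRESHOLD}")

Aggregating by criterion rather than by project is what makes the second pass useful for accreditation: it answers "how well is the program meeting each outcome?" without any data collection beyond what grading already required.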

The assessment strategies presented are of potential utility to any engineering faculty member, but may be of particular interest to new faculty members, for whom research productivity and the generation of publications are essential. This paper will present evidence that implementing the assessment process led directly to improved student performance in the Junior/Senior Clinic, and thus improved the overall research productivity of the entire department. Further, new faculty members often have innovative ideas for classroom teaching. This paper will demonstrate how the assessment rubrics have been used as a tool for turning pedagogical innovations into publishable pedagogical scholarship.

Programmatic Assessment for Engineering

Background

Since 2000, ABET¹ has required that, in order to be accredited, engineering programs must demonstrate evidence of continuous assessment and continuous improvement. Components of a good assessment strategy include:

Dahm, K. (2010, June). Practical, Efficient Strategies for Assessment of Engineering Projects and Engineering Programs. Paper presented at the 2010 Annual Conference & Exposition, Louisville, Kentucky. doi: 10.18260/1-2--16438
