June 20, 2010
June 23, 2010
New Engineering Educators
15.966.1 - 15.966.13
PRACTICAL, EFFICIENT STRATEGIES FOR ASSESSMENT OF ENGINEERING PROJECTS AND ENGINEERING PROGRAMS
Abstract

The process of seeking and gaining accreditation for an engineering program was substantially changed ten years ago when the EC2000 criteria were implemented. (The moniker EC2000 is no longer in use; they are now simply the ABET criteria.) Programs must now define goals and objectives for their program, provide evidence that graduates are meeting these objectives, and demonstrate evidence of continuous improvement. These accreditation criteria present programs with significant challenges. Departments must determine what data are needed and collect them regularly. To be sustainable, assessment plans must make efficient use of faculty time. This paper will present strategies for collecting assessment data that serve multiple purposes beyond accreditation, using the Rowan University Junior/Senior Engineering Clinic as an example.
The Rowan University Junior/Senior Engineering Clinic is a multidisciplinary, project-based course required for engineering students in all disciplines. Students solve real engineering research and design problems, many of which are sponsored by local industry. Because each clinic project is unique, grading student work and maintaining approximately uniform expectations across all projects is a significant challenge. At the same time, the Clinic is the course within the Rowan Engineering curriculum that best reflects professional engineering practice. Consequently, the Junior/Senior Clinic provides an excellent forum for assessing whether students have indeed achieved the desired pedagogical outcomes of the curriculum. This paper will present a set of assessment rubrics that is currently used by the Rowan Chemical Engineering department. The data collected serve two purposes: they are used to grade individual student projects, and they are used for program-level assessment.
The assessment strategies presented are of potential utility to any engineering faculty member, but may be of particular interest to new faculty members, for whom research productivity and generation of publications are essential. This paper will present evidence that implementation of the assessment process led directly to improved student performance in the Junior/Senior Clinic, and thus improved the overall research productivity of the entire department. Further, new faculty members often have innovative ideas for classroom teaching. This paper will demonstrate how the assessment rubrics have been used as a tool for turning pedagogical innovations into publishable pedagogical scholarship.
Programmatic Assessment for Engineering

Background

Since 2000, ABET1 has required that, in order to be accredited, engineering programs must demonstrate evidence of continuous assessment and continuous improvement. Components of a good assessment strategy include:
Dahm, K. (2010, June), Practical, Efficient Strategies For Assessment Of Engineering Projects And Engineering Programs Paper presented at 2010 Annual Conference & Exposition, Louisville, Kentucky. 10.18260/1-2--16438
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2010 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015