Session 3232
Some Assessment Tools for Evaluating Curricular Innovations Outcomes
Lueny Morell de Ramírez, José L. Zayas-Castro, Jorge I. Vélez-Arocho
University of Puerto Rico-Mayagüez
Abstract
One of the most critical aspects of the new ABET Engineering Criteria 2000 (EC-2000) is the requirement for an outcomes assessment plan for program evaluation and continuous improvement. Outcomes assessment requires the development of assessment tools or instruments to gather data that document whether a program's stated goals and objectives are being met and whether students have acquired the identified skills.
In 1994, a partnership of universities called the Manufacturing Engineering Education Partnership (MEEP) initiated the design and implementation of a novel undergraduate manufacturing program, better known as the Learning Factory [1, 2]. This paper describes how MEEP designed its assessment strategy to evaluate the outcomes of the curricular innovation project and presents some of the assessment instruments/tools that were developed. These tools, some designed in collaboration with industrial partners, were used to assess overall and specific qualitative aspects of the program as well as student performance (e.g., teamwork skills and oral/written communication skills). A total of nine assessment instruments are presented. We believe that the Learning Factory, as well as the project's assessment strategy and tools, complies with EC-2000.
Introduction
The creation and adoption of ABET's new accreditation standards is a historic move to promote innovation and continuous improvement in engineering education [3]. The core of EC-2000 is an outcomes assessment component that requires engineering programs to have in place a continuous process of evaluation and feedback to ensure ongoing improvement of program effectiveness. Numerous resources are available for the development and implementation of outcomes assessment plans. For example, Rogers and Sando have prepared a user-friendly, step-by-step booklet that presents eight steps in developing an assessment plan [4]. Regardless of how the assessment plan is developed, however, an effective plan must start with the identification of specific goals and objectives and the definition of performance criteria, followed by data collection.
1. Penn State University, University of Washington, and the University of Puerto Rico at Mayagüez, in collaboration with Sandia National Laboratories. Project sponsored by the Technology Reinvestment Project (TRP Project #3018, NSF Award #DMI-9413880).
2. John S. Lamancusa, Jens E. Jorgensen, and José L. Zayas, "The Learning Factory: A New Approach to Integrating Design and Manufacturing into Engineering Curricula," ASEE Journal of Engineering Education, Vol. 86, No. 2, April 1997.
3. George D. Peterson, "Engineering Criteria 2000: A Bold New Change Agent," ASEE PRISM, September 1997.
4. Gloria M. Rogers and Jean K. Sando, Stepping Ahead: An Assessment Plan Development Guide, Foundation Coalition, 1996.