Using a Baldrige/Sterling Evaluation Plan for an NSF ATE Center
FLATE, the Florida Advanced Technological Education Center, is an NSF ATE Regional Center of Excellence whose mission is to create a manufacturing educational delivery system by offering the technical programs, curriculum development, best-practice demonstrations, student involvement, and outreach activities necessary to meet the workforce capacity and high-performance skill needs of the manufacturing sectors within its region. To accomplish this mission, FLATE initiates and participates in a variety of projects and activities. To meet the reporting needs of the National Science Foundation and to align itself with business models, FLATE has developed an Evaluation Plan that uses the impact and effectiveness data required by NSF as one component of a more comprehensive organizational self-evaluation plan based on the Malcolm Baldrige Criteria. This approach keeps the Center's projects and activities, data, and motivation aligned with its vision, mission, goals, and target objectives.
Traditional Evaluation Plans and Their Implementation
NSF-funded projects and centers focused on student recruitment and outreach, curriculum development and deployment, and professional development along STEM career pathways are all required to submit evaluation plans with their project proposals and to provide annual reports of their performance data documenting their activities. An informal survey of several such projects reveals that they contain many similar components, including a variety of formative (periodic assessment), summative (end-of-project), and longitudinal data elements. Because many of the projects are broad in activity scope yet focused on a single technology sector, the types of activities, and when and how they are conducted, may be similar but not identical. This forces many similar activities to be added together as activity data (e.g., the number of students attending an outreach/promotional event), as sketched in the example below. Additionally, many projects and/or centers build in a "process evaluation" that provides feedback on project implementation, timeliness, etc.
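To make the aggregation point concrete, the following is a minimal sketch in Python; the record structure and field names (type, students_reached) are illustrative assumptions, not FLATE's actual data collection scheme. It shows several similar activities being rolled up into a single reported activity-data element:

```python
from collections import Counter

# Hypothetical activity records; field names are illustrative only.
activity_log = [
    {"type": "outreach_event", "students_reached": 42},
    {"type": "outreach_event", "students_reached": 130},
    {"type": "curriculum_workshop", "students_reached": 25},
]

# Add similar activities together into one activity-data element,
# e.g. total students attending outreach/promotional events.
totals = Counter()
for record in activity_log:
    totals[record["type"]] += record["students_reached"]

print(totals["outreach_event"])  # 172
```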
Data elements (activity data) can provide information to answer some questions about broader impacts and institutional effectiveness. These research-type questions draw conclusions and sometimes yield recommendations and/or best practices for various types of activities and programs. More direct effects on the host academic institution's faculty and teachers, programs, students, and other stakeholders may also be revealed by analysis of this type of data over time. All of this requires continuous attention by the project leadership to be sure that good data are collected, recorded, filtered/cleaned (if need be), and ultimately reviewed and analyzed for reporting purposes. Below is a single pass, with no feedback loops (although many are possible), that simply defines the various steps of an evaluation. Following these steps diligently, under the guidance of a trained professional, should provide sufficient meaningful information for a comprehensive report that satisfies the grant reporting requirements.
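As one way to picture that single pass, the sketch below walks toy activity records through the record, filter/clean, and review/analyze steps and returns the summary figures a report would draw on; the function name, fields, and cleaning rule are assumptions for illustration only.

```python
def run_evaluation_pass(raw_records):
    """One illustrative pass: record -> filter/clean -> analyze -> report."""
    # Record: keep only entries that carry the fields the report needs.
    recorded = [r for r in raw_records if "type" in r and "count" in r]

    # Filter/clean (if need be): drop obviously bad data, e.g. negative counts.
    cleaned = [r for r in recorded if r["count"] >= 0]

    # Review/analyze: summarize activity data per activity type.
    summary = {}
    for r in cleaned:
        summary[r["type"]] = summary.get(r["type"], 0) + r["count"]

    # Report: return the figures an annual grant report would cite.
    return summary

# Toy data standing in for collected activity records.
print(run_evaluation_pass([
    {"type": "outreach_event", "count": 42},
    {"type": "outreach_event", "count": 130},
    {"type": "pd_workshop", "count": -1},  # bad entry, cleaned out
]))
# -> {'outreach_event': 172}
```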