Session 2557
Statistics for Program Assessment: Has the Program Made a Difference?
Mary R. Anderson-Rowland, Arizona State University
Abstract
As funding becomes scarcer and the demand for accountability increases, credible assessment and evaluation become more important. Funding to establish and improve activities designed to increase enrollment and retention in engineering, for example, is generally scarce. Therefore, almost all funding allocated to these recruitment and retention activities requires an assessment of the program to determine whether the money and time have been well spent.
This paper describes basic statistical concepts that should be considered when assessing a program or activity. Examples are given to illustrate both good and poor program assessment. Warnings are given about data that may turn out to be useless, and suggestions are presented on ways to enhance data presentation. What it takes for data to be “significant” is also discussed, as well as the problem of sample size.
Without proper planning of assessments and data collection, it may be very difficult to show that a program has made a difference. A program director who does not have a strong statistical background would be well advised to include an assessment specialist on the team to help plan the assessment strategy, analyze the data, and draw conclusions.
Keywords: Evaluation, Assessment, Data Analysis, Statistical Testing
I. Introduction
Many university and college budgets are strained. There is not enough money to comfortably support all of the programs worthy of funding. Terms such as accountability, productivity, responsiveness, efficiency, results, impact, and leveraging are used as tough decisions are made to fund and to continue programs. Engineering schools today are engaged in many activities outside of the classroom. Major issues include recruitment, retention, graduation, and K-12 outreach programs. To fund these programs, tough decisions need to be made by engineering deans on how much money goes to support outreach and retention, along with hiring faculty, providing seed money for research, and buying equipment.

Many engineering programs seek national funding through a government organization such as the National Science Foundation or the Department of Education. To show that the money and time will be well spent on a particular project, an assessment plan is needed. During and at the end of a project, a report is usually required to show that the program was successful, that a change was made, or that a result was obtained. User-friendly guidebooks have been developed that describe both formative and summative assessment.1
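To fix ideas about the question, raised in the abstract, of when assessment data are “significant,” a minimal sketch may help: the Python fragment below applies a standard two-proportion z-test to compare the retention rate of program participants against that of a comparison group. The function and all counts are hypothetical illustrations, not figures from this paper.

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two proportions.

    x1, n1: retained students and cohort size for program participants
    x2, n2: retained students and cohort size for a comparison group
    Returns the z statistic and the two-sided p-value.
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)              # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))
    return z, p_value

# Hypothetical counts: 78 of 100 participants retained vs. 330 of 500 others
z, p = two_proportion_z_test(78, 100, 330, 500)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")    # here p is about 0.019
```

With these illustrative numbers the p-value falls below the conventional 0.05 threshold, so the observed 12-point difference in retention would be judged unlikely to arise by chance alone; with much smaller cohorts the same difference might not be, which is the sample-size problem the paper takes up.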