Session 2258
Software for the Automated Evaluation of Web-Delivered Instruction
George Nickles, Amy Pritchett
School of Industrial and Systems Engineering, Georgia Institute of Technology
Introduction
Many forms of technology have been used to mediate education between instructor and student, ranging from simple chalkboard drawings to complex intelligent tutoring systems. Recently, the advantages of the Internet, including speed of communication and use of a variety of media, have made it the focus of much educational research.
The effectiveness of Internet-mediated education must be proven through evaluation. In the context of educational systems, evaluation can be briefly defined as examining the effectiveness of an educational system (or a component of that system) in meeting its learning and teaching goals. Bloom, Hastings, and Madaus1 give a classic, more detailed definition.
There are many measurement issues to consider when preparing an educational evaluation. One is to understand what form of evaluation is being conducted. The three major forms of evaluation, corresponding to stages in the system's life cycle, are planning, formative, and summative.2, 3 Planning evaluation takes place early in the design phase to ensure the system is consistent with known educational theories. Formative evaluation takes place during development and implementation and is intended to drive improvement. It has been compared to quality control, as it continuously searches for weaknesses in the technology and for opportunities to improve.1 Summative evaluation judges the effectiveness of an educational system after it has completed one or more cycles of operation. While all three are necessary, formative evaluation is the type most closely tied to instructors' efforts to improve the educational system.
A recent survey finds that 74% of engineering instructors use the Internet to provide instructional material to students,4 adding a component to their educational systems that requires evaluation. However, the same survey also notes that only 41% of instructors who use the Internet report evaluating the Internet components of their courses. The survey results suggest some reasons for this low number, particularly a lack of time and of evaluation support resources for instructors. There are few off-the-shelf tools for evaluating learning over the Internet, so instructors (or departments) must build their own to suit their needs. Instructors may not receive release time to develop such tools, may simply lack time due to other commitments, or may not have the evaluation knowledge and programming skills required to develop them.
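For illustration only, the sketch below shows the kind of minimal, home-grown interaction logger an instructor without an off-the-shelf tool might assemble to collect formative-evaluation data from web-delivered material. The file name, record fields, and function are hypothetical; this is not the software described in this paper.

```python
# Illustrative sketch of a home-grown evaluation logger (hypothetical names).
import csv
import datetime
from pathlib import Path

LOG_FILE = Path("interaction_log.csv")  # assumed log location

def log_interaction(student_id: str, page: str, action: str) -> None:
    """Append one timestamped record of a student's interaction with a page."""
    new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["timestamp", "student_id", "page", "action"])
        writer.writerow([datetime.datetime.now().isoformat(),
                         student_id, page, action])

# Example: record that a student viewed a lecture page and submitted a quiz.
log_interaction("s042", "lecture3.html", "view")
log_interaction("s042", "quiz3.html", "submit")
```

Even a logger this simple illustrates the burden the survey describes: the instructor must design the data to capture, write and maintain the code, and then analyze the resulting records, all without dedicated release time or evaluation support.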
Pritchett, A., & Nickles, G. (2002, June), Software For The Automated Evaluation Of Web Delivered Instruction Paper presented at 2002 Annual Conference, Montreal, Canada. 10.18260/1-2--10586
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2002 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015