A Network Based Multimedia Computerized Testing Tool

Conference: 1997 Annual Conference
Location: Milwaukee, Wisconsin
Publication Date: June 15, 1997
Start Date: June 15, 1997
End Date: June 18, 1997
ISSN: 2153-5965
Page Count: 8
Page Numbers: 2.28.1 - 2.28.8
DOI: 10.18260/1-2--6704
Permanent URL: https://peer.asee.org/6704
Download Count: 571

Paper Authors: Il-Hong Jung, Hosoon Ku, D. L. Evans

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Session 3530

A Network-Based Multimedia Computerized Testing Tool+

Il-Hong Jung, Hosoon Ku, and D. L. Evans*
Center for Innovation in Engineering Education
College of Engineering and Applied Sciences
Arizona State University, Tempe, AZ 85287-6106
email: {ijung, hosoon, devans}@asu.edu

Abstract

In this paper, we describe a network-based, multimedia testing tool, Quizzer, that has been developed for authoring and delivering electronic quizzes and tests. We demonstrate the tool and compare it with traditional paper-based tests. The tool has been classroom tested and will be made available to potential users. Quizzes are easily constructed, updated, or built from test-item databases, and graphics (in several graphics file formats) for questions and/or answers are easily incorporated, as are digital video clips (AVI files). The tool is well suited for pre- and post-exams, student assessment, and self-evaluations.
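As a concrete illustration of the kind of item the abstract describes (a question whose prompt or answers may carry a graphics file or an AVI clip, drawn from a test-item database), here is a minimal sketch in modern Python. It is not the authors' implementation, and all names and fields (QuizItem, graphic_file, video_file, the sample items) are hypothetical.

```python
# Hypothetical sketch, not the authors' code: a minimal data model for
# multimedia test items and a quiz assembled from an item database.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class QuizItem:
    prompt: str                          # question text
    choices: List[str]                   # candidate answers
    correct_index: int                   # index of the correct choice
    graphic_file: Optional[str] = None   # e.g. a GIF/BMP shown with the question
    video_file: Optional[str] = None     # e.g. an AVI clip shown with the question

@dataclass
class Quiz:
    title: str
    items: List[QuizItem] = field(default_factory=list)

# Build a small quiz from an item database (here, just a Python list).
item_database = [
    QuizItem("Which expression states Ohm's law?",
             ["V = IR", "V = I/R", "V = I + R"], 0,
             graphic_file="resistor.gif"),
    QuizItem("What kind of motion does the clip show?",
             ["Uniform", "Uniformly accelerated", "Oscillatory"], 1,
             video_file="cart.avi"),
]
quiz = Quiz(title="Pre-test: circuits and mechanics", items=item_database)
print(f"{quiz.title}: {len(quiz.items)} items")
```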

1. Introduction

Assessment and evaluation (A&E) are important elements in teaching and learning, and they can, and should, take a variety of forms. Good educational practice dictates that assessment be done often and that results be made known to the learner quickly. For example, the current reform movement in calculus was started when assessment techniques other than traditional problem-based examinations showed that what students were learning was not what instructors thought they were teaching. The work of Hestenes [1] confirms a similar phenomenon in physics education.

A significant amount of an instructor's time is usually devoted to constructing and administering assessment instruments and to scoring and evaluating the results. Testing in engineering, as in traditional physics instruction, has typically been done on paper and marked by hand. Instructors generally develop and collect test items in "item pools" or "question banks" [2] over time. From these pools, instructors select the items to use for a particular test; the instructor then constructs the test and copies it for distribution to the students. After completion, the tests must be scored, a process that can stretch the feedback time to days.
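The two time-consuming steps in that paper workflow, selecting items from a question bank and scoring the completed tests, are exactly what a computerized tool can automate. The following sketch is hypothetical Python, not code from the paper; `build_test`, `score`, and the sample bank are illustrative names. It shows how a test might be assembled from a bank and scored immediately against an answer key instead of days later.

```python
# Hypothetical sketch, not the authors' code: assemble a test from a question
# bank organized by topic, then score responses immediately against the key.
import random

question_bank = {
    "statics":  [("Q1", "A"), ("Q2", "C"), ("Q3", "B")],   # (item id, correct answer)
    "dynamics": [("Q4", "D"), ("Q5", "A")],
}

def build_test(bank, per_topic=2, seed=None):
    """Draw up to `per_topic` items from each topic in the bank."""
    rng = random.Random(seed)
    test = []
    for topic, items in bank.items():
        test.extend(rng.sample(items, min(per_topic, len(items))))
    return test

def score(test, responses):
    """Count responses that match the answer key; feedback is immediate."""
    return sum(1 for item_id, key in test if responses.get(item_id) == key)

test = build_test(question_bank, seed=42)
student_responses = {"Q1": "A", "Q2": "C", "Q4": "D", "Q5": "B"}
print(f"Score: {score(test, student_responses)} / {len(test)}")
```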

+ Partially supported by the National Science Foundation under Cooperative Agreement EEC92-21460 for the Foundation Coalition.
* Director of Center for Innovation in Engineering Education

Jung, I., & Ku, H., & Evans, D. L. (1997, June), A Network Based Multimedia Computerized Testing Tool. Paper presented at the 1997 Annual Conference, Milwaukee, Wisconsin. 10.18260/1-2--6704

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 1997 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015