What Online Quizzing Can Tell Us About Our Students


2005 Annual Conference


Portland, Oregon

Publication Date

June 12, 2005

Start Date

June 12, 2005

End Date

June 15, 2005



Conference Session

Assessing with Technology

Page Numbers

10.1468.1 - 10.1468.9



Paper Authors

Andrew J. Wiesner

Jonathan P. Mathews

Sarma Pisupati

NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

What Online Quizzing Can Tell Us About Our Students

Jonathan P. Mathews*, Sarma V. Pisupati*, and Andrew Wiesner±

*Energy and Geo-Environmental Engineering Department and John A. Dutton e-Education Institute, College of Earth & Mineral Sciences; ±Schreyer Institute for Teaching Excellence; The Pennsylvania State University, University Park, PA 16802


Computer-based quiz and exam results from a large-enrollment general education class were analyzed to determine what an in-depth analysis of the quizzing data could tell us about our students. Analysis of variance methods were used to study the effects of gender, grade point average, and class standing on overall test performance, as well as on the multiple-choice and short-answer sections of the tests. The effects of these factors on student quiz behavior in relation to exam scores were also evaluated, with particular attention to behavioral differences based on gender and GPA.

The "B" or better students were less likely to miss the weekly quiz (3%) than lower-GPA students (20%). Students were likely to at least match their incoming GPA with their course grade in this general education course. Multiple comparisons of essay time showed significant differences between the A/B students and the C/D/F students, with the better-performing students taking more time to complete this portion of the exam. Within these timed exams, there was no significant difference between the sexes in the quantity typed: 752 words for males versus 742 words for females (35-minute exam with instructions to answer 4 of the 6 questions). Evaluating short-answer essay scores against multiple-choice scores within three timed exams indicated that 71.5%, 90.6%, and 77.2% of the students did better on the essay component than on the multiple-choice component for the three exams. For the multiple-choice portion of the exam, surprisingly, with 38 questions, only 14% of male and 16% of female students took more than 20 minutes to answer; the average time required was 15 of the 25 available minutes. An analysis of the essay portion of the exam indicates that both genders submitted the exam after an average of 32 minutes rather than utilizing the full 35 minutes, and approximately 20% of the class (and of each gender) submitted the exam in less than 30 minutes.
Most of the students were tenacious in obtaining a 100% score on the weekly quiz, despite the quiz being worth only 1% of the grade. Hence, this approach was considered useful as a directed review of the material.
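The analysis pipeline the abstract describes, analysis of variance on test performance by a factor such as GPA band, followed by pairwise multiple comparisons where the F test is significant, can be sketched in a few lines. This is a hypothetical illustration with invented scores, not the authors' code or data:

```python
# A minimal, self-contained sketch of a one-way ANOVA of the kind described
# above, comparing exam scores across incoming-GPA bands. All numbers below
# are synthetic illustrations; the paper's actual data are not reproduced.

def one_way_anova(groups):
    """Return (F statistic, df_between, df_within) for a one-way ANOVA."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    group_means = [sum(g) / len(g) for g in groups]
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, group_means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means) for x in g)
    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within

# Synthetic exam scores (percent) grouped by incoming GPA band.
scores = [
    [88, 92, 85, 90, 87, 91],  # "A/B" students
    [78, 74, 80, 76, 79, 75],  # "C" students
    [65, 70, 62, 68, 66, 64],  # "D/F" students
]

f, df_b, df_w = one_way_anova(scores)
print(f"F({df_b}, {df_w}) = {f:.1f}")
# An F well above the 5% critical value (roughly 3.7 for these degrees of
# freedom) would indicate significant mean differences across GPA bands,
# motivating the pairwise multiple comparisons mentioned in the abstract.
```

In practice a library routine (e.g. SciPy's `f_oneway`) would report the p-value directly, and a Tukey-style procedure would handle the multiple-comparison step.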

Wiesner, A. J., & Mathews, J. P., & Pisupati, S. (2005, June), What Online Quizzing Can Tell Us About Our Students. Paper presented at 2005 Annual Conference, Portland, Oregon. 10.18260/1-2--14513

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2005 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015