Does Student Crowdsourcing of Practice Questions and Animations Lead to Good Quality Materials?

Conference

2015 ASEE Annual Conference & Exposition

Location

Seattle, Washington

Publication Date

June 14, 2015

Start Date

June 14, 2015

End Date

June 17, 2015

ISBN

978-0-692-50180-1

ISSN

2153-5965

Conference Session

Computer-Based Tests, Problems, and Other Instructional Materials

Tagged Division

Computers in Education

Page Count

15

Page Numbers

26.561.1 - 26.561.15

DOI

10.18260/p.23899

Permanent URL

https://peer.asee.org/23899

Paper Authors

Alex Daniel Edgcomb, University of California, Riverside

Alex Edgcomb finished his Ph.D. in computer science at UC Riverside in 2014. He has continued working as a research specialist at UC Riverside with his Ph.D. advisor, studying the efficacy of web-native content for STEM education. He also works with Zyante, a startup that develops interactive, web-native textbooks in STEM.

Joshua Sai Yuen, University of California, Riverside

Graduate student at the University of California, Riverside.

Frank Vahid, University of California, Riverside

Frank Vahid is a Professor of Computer Science and Engineering at the University of California, Riverside. His research interests include embedded systems design and engineering education. He is a co-founder of zyBooks.com.

Abstract

Experiments in Student Crowdsourcing of Practice Question and Animation Creation

Modern, web-native textbooks use practice questions and animations to improve student performance. Practice questions help students learn and digest written material. Animations visually explain concepts. However, creating practice questions and animations is time intensive. Educational content has been crowdsourced by Wikipedia and Khan Academy using content creators and raters: content creators produce new content, and raters sort the content by quality.

This paper investigates whether student crowdsourcing can develop good-quality practice questions and animations. We conducted four experiments to determine whether students can: (1) create good-quality practice questions, (2) reliably rate practice questions, (3) create good-quality animations, and (4) reliably rate animations. Experiments (1) and (2) included 25 participants from an introductory embedded systems course (IES), and experiments (3) and (4) included a total of 587 participants from a basic computing course (BCC). IES students created 90 practice questions, each rated from 1 to 5 (5 is best) by a professor. Using the same rating scale, the professor also rated the 19 student-voted-best BCC-student-made animations.

Four of the 90 practice questions achieved a rating of 4; none achieved a 5. Two of the 19 BCC-student-made animations rated by the professor achieved a 5. For practice questions, the average of the top 20% of student ratings was strongly correlated with the professor rating, with R-value = 0.82 (p-value = 0.02). For animations, the average of the top 10% of student ratings was strongly correlated with the professor ratings, with R-value = 0.88 (p-value < 0.001).

Students can reliably rate practice questions and animations. Some students can make good-quality animations, but students cannot make good-quality practice questions.

Edgcomb, A. D., & Yuen, J. S., & Vahid, F. (2015, June), Does Student Crowdsourcing of Practice Questions and Animations Lead to Good Quality Materials? Paper presented at 2015 ASEE Annual Conference & Exposition, Seattle, Washington. 10.18260/p.23899

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015