Location: Seattle, Washington
Publication Date: June 14, 2015
Conference Start Date: June 14, 2015
Conference End Date: June 17, 2015
ISBN: 978-0-692-50180-1
ISSN: 2153-5965
Conference Session: Computer-Based Tests, Problems, and Other Instructional Materials
Division: Computers in Education
Page Count: 15
Pages: 26.561.1 - 26.561.15
DOI: 10.18260/p.23899
Permanent URL: https://peer.asee.org/23899
Download Count: 563
Alex Edgcomb finished his Ph.D. in computer science at UC Riverside in 2014. Alex has continued working as a research specialist at UC Riverside with his Ph.D. advisor, studying the efficacy of web-native content for STEM education. Alex also works with Zyante, a startup that develops interactive, web-native textbooks in STEM.
Frank Vahid is a Professor of Computer Science and Engineering at the University of California, Riverside. His research interests include embedded systems design and engineering education. He is a co-founder of zyBooks.com.
Experiments in Student Crowdsourcing of Practice Question and Animation Creation

Modern, web-native textbooks use practice questions and animations to improve student performance. Practice questions help students learn and digest written material. Animations visually explain concepts. However, creating practice questions and animations is time intensive. Educational content has been crowdsourced by Wikipedia and Khan Academy using content creators and raters. Content creators produce new content; raters sort the content by quality.

This paper investigates whether student crowdsourcing can develop good-quality practice questions and animations. We conducted four experiments to determine whether students can: (1) create good-quality practice questions, (2) reliably rate practice questions, (3) create good-quality animations, and (4) reliably rate animations. Experiments (1) and (2) contained 25 participants from an introductory embedded systems course (IES), and experiments (3) and (4) contained a total of 587 participants from a basic computing course (BCC). IES students created 90 practice questions, each rated from 1 to 5 (5 is best) by a professor. Using the same rating scale, the professor also rated the 19 student-voted-best BCC-student-made animations.

4 of the 90 practice questions achieved a rating of 4; none achieved 5. 2 of the 19 BCC-student-made animations rated by the professor achieved 5. For practice questions, the average of the top 20% of student ratings was strongly correlated with the professor rating, with R-value = 0.82 (p-value = 0.02). For animations, the average of the top 10% of student ratings was strongly correlated with the professor ratings, with R-value = 0.88 (p-value < 0.001).

Students can reliably rate practice questions and animations. Some students can make good-quality animations, but students cannot make good-quality practice questions.
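The correlation analysis described in the abstract (averaging the top 20% or 10% of student ratings per item, then computing a Pearson correlation against the professor's ratings) can be illustrated with a minimal sketch. The code below is not from the paper's materials; the data, item counts, and helper name top_fraction_mean are hypothetical, and it simply shows how such a top-fraction-mean versus professor-rating correlation could be computed.

    # Hypothetical sketch of the abstract's analysis: mean of the top k% of student
    # ratings per item, correlated (Pearson) with the professor's 1-5 rating.
    import numpy as np
    from scipy.stats import pearsonr

    def top_fraction_mean(ratings, fraction):
        """Mean of the highest `fraction` of ratings for one item (1-5 scale)."""
        ordered = np.sort(np.asarray(ratings, dtype=float))[::-1]  # descending
        k = max(1, int(round(fraction * len(ordered))))            # keep at least one rating
        return ordered[:k].mean()

    # Hypothetical data: 5 items, each with several student ratings and one professor rating.
    student_ratings = [
        [5, 4, 4, 3, 2, 2],
        [3, 3, 2, 2, 1, 1],
        [5, 5, 4, 4, 4, 3],
        [4, 3, 3, 2, 2, 1],
        [5, 4, 3, 3, 2, 2],
    ]
    professor_ratings = [4, 2, 5, 3, 4]

    top20_means = [top_fraction_mean(r, 0.20) for r in student_ratings]
    r_value, p_value = pearsonr(top20_means, professor_ratings)
    print(f"R = {r_value:.2f}, p = {p_value:.3f}")

Using 0.10 instead of 0.20 would mirror the animation analysis; the actual rating data and procedure details are those reported in the paper itself.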
Edgcomb, A. D., & Yuen, J. S., & Vahid, F. (2015, June), Does Student Crowdsourcing of Practice Questions and Animations Lead to Good Quality Materials? Paper presented at 2015 ASEE Annual Conference & Exposition, Seattle, Washington. 10.18260/p.23899
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015