June 14, 2015
June 17, 2015
Educational Research and Methods
26.1284.1 - 26.1284.14
Psychometric Analysis of Residence and MOOC Assessment Data

Undergraduate STEM programs face the daunting challenge of managing instruction and assessment for classes that enroll thousands of students per year, and the bulk of student assessment is often determined by multiple-choice tests. Instructors try to monitor reliability metrics and diagnostics for item quality, but rarely is there a more formal evaluation of the psychometric properties of these assessments. We see an opportunity to have a major impact on undergraduate science instruction by incorporating more rigorous measurement models for testing and using them to support instructional goals and assessment.

We propose to apply item response theory to analyze the tests from recent years in undergraduate STEM classes (physics, chemistry, and statistics) involving tens of thousands of students. We will evaluate whether the tests are equally informative across the grade distribution, and we will assess the dimensionality of the test data to infer separable aspects of achievement.

We will also examine data from two large MOOCs conducted by University X. Our research questions here will be whether these assessments exhibit similar properties in terms of information content, population heterogeneity, and factors affecting performance. The student populations in MOOCs are an order of magnitude larger than in introductory STEM classes, and they operate in a very different educational context.

Large undergraduate courses (both residence and MOOC) deliver millions of multiple-choice tests each year. These tests determine grades and student progress into further STEM study, and they are also the most common outcome measure for evaluating educational interventions. These courses should use the same advanced measurement models used to develop so-called "high stakes" tests, such as college admissions tests.
A rigorous assessment program can serve three important functions: (1) to improve assessment by allowing the construction of psychometrically sound tests, and by facilitating standardization across different sections and different years of the same course; (2) to offer enhanced learning opportunities for students through computer-adaptive study materials with feedback; and (3) to open up opportunities for research on methods for enhancing student learning and engagement, and for investigating fairness in assessment.
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015