Paper presented at the 2014 ASEE Annual Conference & Exposition, Indianapolis, Indiana, June 15-18, 2014
Publication Date: June 15, 2014
ISSN: 2153-5965
Division: Computers in Education
Pages: 24.565.1 - 24.565.17 (17 pages)
DOI: 10.18260/1-2--20456
Permanent URL: https://peer.asee.org/20456
Mark Urban-Lurain is an associate professor and the associate director of the Center for Engineering Education Research at Michigan State University. He is the lead PI and project director of the AACR project.
Dr. Urban-Lurain is responsible for teaching, research, and curriculum development, with an emphasis on engineering education and, more broadly, STEM education. His research interests are in theories of cognition, how these theories inform the design of instruction, how we might best design instructional technology within those frameworks, and how the research and development of instructional technologies can inform our theories of cognition. He is also interested in preparing future STEM faculty for teaching, incorporating instructional technology as part of instructional design, and STEM education improvement and reform.
Melanie Cooper is the Lappan-Phillips Professor of Science Education and a professor of chemistry at Michigan State University. She received her B.S., M.S., and Ph.D. in chemistry from the University of Manchester, England. Her research has focused on improving teaching and learning in large-enrollment general and organic chemistry courses at the college level, and she is a proponent of evidence-based curriculum reform, for example the NSF-supported “Chemistry, Life, the Universe & Everything.” She also has developed technological approaches to formative assessment that can recognize and respond to students' free-form drawings, such as the beSocratic system. She is a Fellow of the American Chemical Society and the American Association for the Advancement of Science, and a member of the leadership team for the Next Generation Science Standards (NGSS) and the National Research Council's advisory board on science education. She has received a number of awards, including the ACS Award for Achievement in Research on Teaching and Learning in 2014, the Norris Award for Outstanding Achievement in the Teaching of Chemistry in 2013, and the 2010-2011 Outstanding Undergraduate Science Teacher Award from the Society for College Science Teaching.
Kevin Haudek is a research specialist in the Center for Engineering Education Research at Michigan State University. He is a member of the AACR research group. His research interests are in student understanding and application of chemistry in biological contexts, and strategies to increase student writing in undergraduate STEM courses.
Jenny Knight is a senior instructor in the department of molecular, cellular and developmental biology at the University of Colorado, Boulder, where she also has been the departmental coordinator of the Carl Wieman Science Education Initiative. She also directs the National Academies' regional Mountain West Summer Institute on Undergraduate Education in Biology. Her research focuses on developing meaningful biology concept assessments, uncovering student misconceptions in genetics and introductory biology, and most recently, on the nature of in-class student discussions and their impact on student reasoning and learning.
Paula Lemons is an assistant professor in the department of biochemistry and molecular biology at the University of Georgia. She does research on problem solving among college biology students. She also investigates the process by which college biology instructors make changes to their teaching.
Carl T. Lira teaches thermodynamics at all levels, chemical kinetics, and material and energy balances. He has been recognized with the Amoco Excellence in Teaching Award and has received the MSU College of Engineering Withrow Teaching Excellence Award multiple times. He is co-author of a widely used textbook, Introductory Chemical Engineering Thermodynamics. He has active research in phase equilibria, kinetics, alternative fuels, and reactive distillation. He has M.S. and Ph.D. degrees from the University of Illinois, Urbana-Champaign, and a B.S. from Kansas State University.
Ross Nehm is an associate professor in the department of ecology and evolution, and core faculty in the Ph.D. program in science education at Stony Brook University (SUNY). Dr. Nehm has authored or co-authored 50 journal articles and book chapters and presented more than 100 conference talks and papers. Dr. Nehm currently serves on the editorial boards of the Journal of Research in Science Teaching, the Journal of Science Teacher Education, and the Journal of Science Education and Technology. He also serves on the advisory boards of several national science education projects, and has served as panel chair for several NSF programs. For several years he has served on the NARST Outstanding Dissertation committee. Dr. Nehm's major awards include a CAREER award from the National Science Foundation, a teaching award from Berkeley, and a mentoring award from CUNY. In 2013-14 Dr. Nehm was named an Education Mentor in the Life Sciences by the National Academies.
Dr. Mary Anne Sydlik is the director of the Science and Mathematics Program Improvement (SAMPI) Center, an outreach division of the Mallinson Institute for Science Education at Western Michigan University. SAMPI specializes in evaluation, research, and technical assistance for K-12 schools and higher education institutions. She is the external evaluator for the project.
Dr. Sydlik's interests are in supporting efforts to improve the educational experiences and outcomes of undergraduate STEM students. She has been the lead external evaluator for a number of STEM and NSF-funded projects, including an NSF TUES III project, a WIDER project, an NSF EEC project through WGBH Boston, an NSF RET project, an S-STEM project, a CPATH project, and a CCLI Phase II project. She also currently serves as the internal evaluator for WMU's Woodrow Wilson Fellows project and the institution's Howard Hughes Medical Institute project, and has contributed to other current and completed evaluations of NSF-funded projects carried out at SAMPI.
Work-in-Progress: Expanding a National Network for Automated Analysis of Constructed Response Assessments to Reveal Student Thinking in STEM

Improving STEM education requires valid and reliable instruments for providing insight into student thinking. Constructed response (CR) assessments reveal more about student thinking and the persistence of misconceptions than do multiple-choice questions, but require more analysis on the part of educators.

We have developed constructed response versions of well-established conceptual assessment inventories and created computer automated analysis resources that predict human ratings of student writing about these topics in introductory STEM courses. The research uses a two-stage, feature-based approach to automated analysis of constructed response assessments. First, we design items to identify important disciplinary constructs based on prior research. The items are administered via online course management systems where students enter responses. We use lexical analysis software to extract key terms and scientific concepts from the students' writing. These terms and concepts are then used as variables for statistical classification techniques that predict expert ratings of student responses. The inter-rater reliability (IRR) between automated predictions and expert human raters is as high as the IRR between human experts.

We recently received another round of funding to extend our work by providing an online community where instructors may obtain, score, and contribute to the library of items and resources necessary for their analyses.

The specific goals of this project are to:

1. Create a community web portal for the assessments to expand and deepen collaborations among STEM education researchers. This portal will provide the infrastructure for expanding the community of researchers and supporting the adoption and implementation of the innovative instructional materials by instructors at other institutions.

2. Transport the innovations by providing instructors professional development and support to use the assessments. This includes information about common student conceptions revealed by the questions, instructional materials for addressing conceptual barriers, the opportunity to join a community of practitioners who are using the questions and exchanging materials, and long-term, ongoing support.

3. Expand the basic research to create and validate items in introductory chemistry, chemical engineering, and statistics.

4. Engage in ongoing project evaluation for continuous quality improvement and to document the challenges and successes the project encounters.

5. Lay the foundation for sustainability by providing interfaces for e-text publishers, Learning Management System (LMS) vendors, and Massive Open Online Courses (MOOCs) as potential revenue streams to operate and maintain the online infrastructure.

This paper provides an overview of the goals of the project and introduces opportunities to participate in the development of a national network of faculty using these techniques.
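The two-stage, feature-based approach described above can be sketched in miniature: extract discipline-relevant terms from student writing, use them as variables for a classifier that predicts expert ratings, and then compare machine predictions with human ratings via an inter-rater reliability index such as Cohen's kappa. The term list, toy responses, rule-based stand-in classifier, and ratings below are all invented for illustration; the paper does not specify the project's actual lexical analysis software or statistical classification techniques.

```python
from collections import Counter

# Stage 1: lexical analysis -- count occurrences of discipline-relevant terms
# in each response. (Hypothetical term list for illustration.)
KEY_TERMS = {"energy", "bond", "entropy", "enthalpy"}

def extract_features(response: str) -> dict:
    words = response.lower().split()
    return {term: words.count(term) for term in KEY_TERMS}

# Stage 2: a toy rule standing in for the statistical classification
# techniques that predict expert ratings from the extracted features.
def predict_rating(features: dict) -> str:
    return "correct" if features["energy"] and features["bond"] else "incorrect"

def cohens_kappa(a, b):
    """Cohen's kappa between two sets of ratings over the same items."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    counts_a, counts_b = Counter(a), Counter(b)
    expected = sum(counts_a[c] * counts_b[c] for c in set(a) | set(b)) / n**2
    return (observed - expected) / (1 - expected)

# Invented student responses and expert ratings, for illustration only.
responses = [
    "breaking the bond releases energy",
    "energy is stored in the bond",
    "the molecule just falls apart",
    "entropy always increases",
]
human = ["correct", "correct", "incorrect", "incorrect"]
machine = [predict_rating(extract_features(r)) for r in responses]
kappa = cohens_kappa(human, machine)
```

In the project itself, the classifier is trained so that the machine-human kappa approaches the kappa observed between pairs of expert human raters; in this toy example the rule happens to agree with the invented ratings exactly.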
Urban-Lurain, M., Cooper, M. M., Haudek, K. C., Kaplan, J. J., Knight, J. K., Lemons, P. P., Lira, C. T., Merrill, J. E., Nehm, R., Prevost, L. B., Smith, M. K., & Sydlik, M. (2014, June). Expanding a National Network for Automated Analysis of Constructed Response Assessments to Reveal Student Thinking in STEM. Paper presented at the 2014 ASEE Annual Conference & Exposition, Indianapolis, Indiana. 10.18260/1-2--20456
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2014 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015