Portland, Oregon
June 23–26, 2024
NSF Grantees Poster Session
DOI: 10.18260/1-2--46847
https://peer.asee.org/46847
Kaelyn Marks is a clinical psychology trainee within Hofstra University’s Clinical Psychology Ph.D. program. She received her B.A. in Applied Psychology from the University of Massachusetts Amherst, an M.S. in Psychological Science from SUNY New Paltz, and an M.A. in Clinical Psychology from Hofstra University. At Hofstra she teaches courses in psychopathology and research methods.
Dr. Saryn R. Goldberg is an Associate Professor of Mechanical Engineering in Hofstra University’s DeMatteis School of Engineering and Applied Sciences. Dr. Goldberg received her Sc.B. in Engineering with a focus on materials science from Brown University, her M.S. degree in Biomedical Engineering with a focus on biomaterials from Northwestern University, and her Ph.D. in Mechanical Engineering with a focus on biomechanics from Stanford University. At Hofstra she teaches courses in mechanical engineering and materials science. Her research in engineering education focuses on the use of student question-asking to promote metacognition.
Chris Venters is an Associate Professor in the Department of Engineering at East Carolina University in Greenville, North Carolina, USA. He teaches introductory courses in engineering design and mechanics and upper-level courses in fluid mechanics. He earned his Ph.D. in Engineering Education from Virginia Tech in 2014, and his research primarily focuses on conceptual understanding in engineering mechanics courses. He received his M.S. in Aerospace Engineering from Virginia Tech and his B.S. in Aerospace Engineering from North Carolina State University.
Dr. Amy Masnick is a Professor of Psychology at Hofstra University. Dr. Masnick received both her B.S. and Ph.D. in Human Development at Cornell University. At Hofstra she teaches courses in introductory psychology, research methods, cognition, and child development.
A primary goal of our DUE-funded project is to examine the quality of questions about course content asked by students enrolled in a statics course. We have developed a classroom-based intervention that provides statics students with training in the utility of question-asking and with frequent opportunities to submit written questions about their current confusions in the course. We aim to evaluate whether and how the nature and quality of student questions change throughout the semester, and a question taxonomy provides a means for evaluating these changes.
Our original taxonomy was based on one developed for use with physics students (Harper et al., 2003). That taxonomy was roughly hierarchical, with higher-numbered categories representing metacognitively more sophisticated questions. Previously, we shared our process for creating, and subsequently modifying, the taxonomy for use in categorizing the quality of questions students ask about statics (reference to author work removed for blind review). While our modified taxonomy increased interrater reliability between faculty raters classifying student questions, a challenge remained for questions that could plausibly fall into more than one category. Consequently, we have considered the utility of a categorization system designed with the expectation that questions will fall into more than one category. This approach alleviates some of the challenges of strictly sorting questions by the type of knowledge required to answer them, which becomes difficult when answers require multiple or overlapping knowledge types. The new approach also allows us to consider additional question features (e.g., closed- or open-ended, correct or incorrect use of statics vocabulary) that more richly characterize question quality.
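To make the multi-label idea concrete, the sketch below shows one possible way such a coding record could be represented. The dimension names, category labels, and question identifier are illustrative placeholders, not the taxonomy reported here; only the closed-/open-ended and vocabulary-use features are drawn from the description above.

```python
from __future__ import annotations

from dataclasses import dataclass, field

# Illustrative sketch only: the dimension names and category labels below are
# placeholders, not the actual dimensions or categories of our taxonomy.
@dataclass
class QuestionCode:
    question_id: str
    knowledge_types: set[str] = field(default_factory=set)  # multiple knowledge-type codes allowed per question
    open_ended: bool = False           # closed- vs. open-ended question
    vocabulary_correct: bool = True    # correct vs. incorrect use of statics vocabulary

# A single question can carry more than one knowledge-type code instead of
# being forced into exactly one category.
example = QuestionCode(
    question_id="Q017",                # hypothetical identifier
    knowledge_types={"conceptual", "procedural"},
    open_ended=True,
    vocabulary_correct=False,
)
print(example)
```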
In this paper, we share our progress on developing a revised taxonomy that captures multiple dimensions of question quality. Specifically, we describe our process of creating the multi-dimensional taxonomy, in which some dimensions are predefined using our prior work on question categorization, while others are explored through inductive coding to surface newly emerging themes in student questions. We show the results of using the new taxonomy to categorize a set of student questions and compare them with results from our previous taxonomy to illustrate differences between the two approaches.
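As a hedged illustration of one way such a comparison could be tabulated (the codes and counts below are invented for demonstration and are not data from the study), the prior single-category assignments can be cross-tabulated against one dimension of the revised taxonomy:

```python
import pandas as pd

# Hypothetical codes for a handful of questions; not data from the study.
prior_category = ["Cat2", "Cat3", "Cat3", "Cat5", "Cat2", "Cat4"]  # one category per question (prior taxonomy)
open_ended = [False, True, True, True, False, True]                # one dimension of the revised taxonomy

# Cross-tabulate to see how the old single-label categories distribute
# across a single dimension of the new multi-dimensional coding.
comparison = pd.crosstab(
    pd.Series(prior_category, name="prior_category"),
    pd.Series(open_ended, name="open_ended"),
)
print(comparison)
```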
Marks, K., Goldberg, S., Venters, C., & Masnick, A. M. (2024, June). Board 273: Exploring a Multi-dimensional Characterization of Statics Students’ Questions. Paper presented at the 2024 ASEE Annual Conference & Exposition, Portland, Oregon. https://doi.org/10.18260/1-2--46847