Board 371: Relationships Between Metacognitive Monitoring During Exams and Exam Performance in Engineering Statics

Conference

2023 ASEE Annual Conference & Exposition

Location

Baltimore, Maryland

Publication Date

June 25, 2023

Start Date

June 25, 2023

End Date

June 28, 2023

Conference Session

NSF Grantees Poster Session

Tagged Topic

NSF Grantees Poster Session

Page Count

9

DOI

10.18260/1-2--43039

Permanent URL

https://peer.asee.org/43039

Paper Authors

Chris Venters, East Carolina University

Chris Venters is an Assistant Professor in the Department of Engineering at East Carolina University in Greenville, North Carolina, USA. He teaches introductory courses in engineering design and mechanics and upper-level courses in fluid mechanics. He earned his Ph.D. in Engineering Education from Virginia Tech in 2014, and his research primarily focuses on conceptual understanding in engineering mechanics courses. He received his M.S. in Aerospace Engineering from Virginia Tech and his B.S. in Aerospace Engineering from North Carolina State University.

Saryn Goldberg, Hofstra University

Dr. Saryn R. Goldberg is an Associate Professor of Mechanical Engineering in Hofstra University’s School of Engineering and Applied Sciences. Dr. Goldberg received her Sc.B. in Engineering with a focus on materials science from Brown University, her M.S. degree in Biomedical Engineering with a focus on biomaterials from Northwestern University, and her Ph.D. in Mechanical Engineering with a focus on biomechanics from Stanford University. At Hofstra she teaches courses in mechanical engineering and materials science. Her research in engineering education focuses on the use of student question-asking to promote metacognition. She is a member of the Society of Women Engineers and the American Society of Engineering Education.

Amy Masnick, Hofstra University

Dr. Amy Masnick is an Associate Professor of Psychology at Hofstra University. Dr. Masnick received both her B.S. and Ph.D. in Human Development at Cornell University. At Hofstra she teaches courses in introductory psychology, research methods, and cognitive psychology.

Kaelyn Marks, Hofstra University

Kareem Panton, Hofstra University

Abstract

Our NSF-DUE-funded project studies whether providing students with training and practice writing questions about their confusions in an undergraduate engineering statics course supports improved course performance and metacognitive awareness.

As part of this study, we investigate relationships between metacognitive monitoring during statics exams and actual exam performance. Metacognitive monitoring is the process of observing one's own understanding and approach while completing a learning task; in this study, the learning tasks are the semester exams and the final exam. One way to assess students' metacognitive monitoring is to measure their ability to accurately predict their score on an assessment of their understanding. Specifically, on each problem of each exam throughout the semester, we asked students to predict their score out of a known total point value. To measure a student's metacognitive skill, a bias index (Schraw, 2009) was calculated for each student on each exam. This index captures the difference between a student's confidence ratings and performance scores and indicates whether a student is "underconfident," i.e., performs better than expected, or "overconfident," i.e., performs worse than expected (Schraw, 2009). To investigate group differences in bias, we took the absolute value of each index, which allowed the magnitudes of the bias indices to be compared and averaged.
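
As an illustration, here is a minimal sketch of how a Schraw-style bias index might be computed; the function name, the normalization of each problem's scores to proportions, and the example numbers are assumptions for illustration, not the authors' analysis code.

```python
import numpy as np

def bias_index(predicted, actual, max_points):
    """Schraw-style bias index for one student on one exam.

    predicted, actual, and max_points each have one entry per exam problem:
    the student's predicted score, earned score, and the problem's total
    point value. Scores are converted to proportions so that problems with
    different point values contribute equally. Positive values indicate
    overconfidence; negative values indicate underconfidence.
    """
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    max_points = np.asarray(max_points, dtype=float)
    confidence = predicted / max_points    # predicted proportion of points
    performance = actual / max_points      # earned proportion of points
    return float(np.mean(confidence - performance))

# Example: predictions of 8/10 and 15/20 against earned scores of 6/10 and 18/20.
b = bias_index([8, 15], [6, 18], [10, 20])   # +0.025, slightly overconfident
abs_bias = abs(b)                            # magnitude used for group comparisons
```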

Preliminary analysis indicates that students who earned a passing grade on an exam (above 60%) had significantly lower absolute bias scores. This suggests that students who earned a passing grade were more likely to accurately predict their exam performance, exhibiting more effective metacognitive monitoring of their own understanding.
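
A minimal sketch of this kind of group comparison, assuming hypothetical absolute bias scores for the two groups and a Welch's t-test (both the data and the specific test are illustrative assumptions, not the reported analysis):

```python
from scipy import stats

# Hypothetical absolute bias indices (proportion scale) for students who
# passed the exam (above 60%) and for students who did not.
abs_bias_pass = [0.03, 0.05, 0.08, 0.10, 0.04, 0.07]
abs_bias_fail = [0.18, 0.22, 0.15, 0.25, 0.12, 0.20]

# Welch's t-test for a difference in mean absolute bias between the groups.
t_stat, p_value = stats.ttest_ind(abs_bias_pass, abs_bias_fail, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```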

We have also found a significant increase in students' ability, regardless of whether they passed or failed, to accurately predict their scores between the first and second exams, which we attribute to a better understanding of the learning task and of how performance would be evaluated. No differences have yet been found in students' ability to predict their test scores between the second exam and the final exam.
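
Similarly, the exam-to-exam change could be examined with a paired comparison of each student's absolute bias; the sketch below uses hypothetical data and a paired t-test as an illustrative assumption rather than the authors' reported method.

```python
from scipy import stats

# Hypothetical per-student absolute bias on Exam 1 and Exam 2 (same students,
# same order), so a paired test is appropriate.
abs_bias_exam1 = [0.20, 0.15, 0.12, 0.25, 0.18, 0.10]
abs_bias_exam2 = [0.10, 0.08, 0.09, 0.15, 0.12, 0.07]

# Paired t-test: lower absolute bias on Exam 2 indicates more accurate predictions.
t_stat, p_value = stats.ttest_rel(abs_bias_exam1, abs_bias_exam2)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```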

Venters, C., & Goldberg, S., & Masnick, A., & Marks, K., & Panton, K. (2023, June), Board 371: Relationships Between Metacognitive Monitoring During Exams and Exam Performance in Engineering Statics Paper presented at 2023 ASEE Annual Conference & Exposition, Baltimore , Maryland. 10.18260/1-2--43039

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2023 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.