June 24, 2017
June 28, 2017
NSF Grantees Poster Session
Metacognition, defined as the knowledge and regulation of one's own cognitive processes, is critically important to student learning and particularly instrumental in problem solving. Despite its importance, much of the research on metacognition has been conducted in controlled laboratory settings, so far less is known about how to help students develop metacognitive skills in the classroom. Further, while there are significant bodies of research on the role of metacognition in writing and in solving mathematics problems, little work has examined the role of metacognition within the engineering disciplines.
The purpose of this project is to generate transferable tools that can be used to teach and evaluate undergraduate engineering students' metacognitive skills. The present paper reports on our development of a metacognitive indicator rubric for assessing students' metacognitive processes and tracking their growth. To date, we have created a six-module metacognitive intervention, piloted it in a sophomore engineering course at a small, private, undergraduate-focused institution, and translated it to two additional engineering education contexts: a first-year and an upper-level engineering course, each at a different university. Each module consists of paired pre-class video, in-class activity, and post-class assignment elements. The videos provide a general view of metacognition situated within a STEM higher education context, while the in-class activities and post-class assignments are specialized for the particular context (e.g., problem-solving, lab, or project-based courses).
To develop the metacognitive indicator rubric, we analyzed student responses to the metacognitive module assignments collected during the intervention pilot. We tested and refined the indicators using student data from subsequent implementations. Later we will work with instructors to ensure the rubric's utility and ease of use. In developing the indicator rubric, we first identified a question from each assignment that exemplified the main purpose of that module. All of the student responses to that question were then pooled and ranked at a "low", "medium", or "high" level of metacognitive processing. Since each module had a main topic, student responses that at least mentioned the topic were ranked as "medium". A "high" level answer related topics from the current module to ones the student had seen before and made plans for implementing the new knowledge. A "low" level answer generally revealed that the student made little attempt to engage with the metacognition module. As such, the metacognitive indicator rubric serves as a translation of common student behavior into the formal elements of metacognition.
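The low/medium/high ranking described above can be illustrated as a simple decision rule. The sketch below is a hypothetical heuristic, not the authors' actual rubric: the keyword lists, function name, and thresholds are illustrative assumptions only, since human raters (not keyword matching) performed the actual ranking.

```python
# Hypothetical sketch of the low/medium/high ranking logic.
# Keyword lists are illustrative assumptions, not the real rubric.

def rank_response(response: str, module_topic_terms: list,
                  prior_topic_terms: list,
                  planning_terms=("plan", "next time", "will try")) -> str:
    """Assign a coarse metacognitive level to one student response."""
    text = response.lower()
    # "Low": no evident engagement with the module's main topic.
    if not any(t in text for t in module_topic_terms):
        return "low"
    relates_prior = any(t in text for t in prior_topic_terms)
    makes_plan = any(t in text for t in planning_terms)
    # "High": connects to prior modules AND plans to apply the knowledge.
    if relates_prior and makes_plan:
        return "high"
    # "Medium": at least mentions the module topic.
    return "medium"

# Example usage (terms are made up for illustration)
level = rank_response(
    "I will plan my study sessions and monitor my understanding, "
    "like the self-testing strategy from last module.",
    module_topic_terms=["monitor"],
    prior_topic_terms=["self-testing"],
)
print(level)  # → "high"
```

In practice the ranking was done by human raters reading full responses; a rule like this only makes the tiered criteria concrete.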
The metacognitive indicator rubric is designed to assist instructors in assessing how their students are engaging with the metacognition modules and in giving students specific, actionable feedback to improve their approaches to learning in the course. The rubric provides specific examples of student behavior, in the students' own words, categorized by level and metacognitive dimension. As students progress through the modules, instructors will be able to track individual students' metacognitive growth and target their feedback accordingly, praising progress and gently challenging less effective approaches to learning.
Cunningham, P., Matusovich, H. M., Hunter, D. N., Blackowski, S. A., & Bhaduri, S. (2017, June). Board #28: Beginning to understand student indicators of metacognition. Paper presented at the 2017 ASEE Annual Conference & Exposition, Columbus, OH. https://doi.org/10.18260/1-2--27820
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2017 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015