Salt Lake City, Utah
June 20, 2004
June 23, 2004
9.397.1 - 9.397.12
Developing a Method to Measure the Metacognitive Effects of a Course on Design, Engineering and Technology over Time
Dale Baker, Senay Yasar, & Sharon Robinson Kurpius: College of Education
Steve Krause & Chell Roberts: Ira A. Fulton School of Engineering
Arizona State University
How individuals become aware of their own understanding (metacognition) cannot easily be measured or tracked by traditional tests or assessments. Consequently, this paper presents the development and application of a rubric for examining qualitative data that illustrates how graduate students in science education, enrolled in a Design, Engineering and Technology (DET) course, became aware of changes in their understanding of DET. Weekly reflection papers, weekly written pre- and posttests, and lesson plans were used as data sources. A rubric linking the course outcomes to six major categories (engineering as a design process, gender and diversity, societal relevance of engineering, technical self-efficacy, tinkering self-efficacy, and transfer to classroom teaching) was developed to code text. Several passes through the data led to refinements of the six categories that allowed nearly all of the text to be coded. We looked specifically for shifts in understanding over a 15-week period and for an awareness that these shifts were taking place (e.g., “It’s not that I had a bad attitude about technology to begin with, rather this class as a whole and our group project has forced me to think about its appropriate applications at the K-12 level.” and “Both technology education papers addressed the difference between technology education and educational technology – two different concepts I had not thought of before”). Our technique allowed us to capture the subtleties of understanding and the progression of metacognition. The rubric demonstrated that the DET course had a strong impact on students’ thinking about and applying DET to teaching.
Quantitative approaches to assessment can tell us how much, how many, or whether group A outperforms group B, and can provide descriptive statistics for a data set. However, quantitative analysis cannot tell us why a group of students did well or poorly on an assessment. Although quantitative techniques can be applied to data sets that are products or artifacts (e.g., a robot designed to perform specific tasks), qualitative techniques provide additional descriptive information about the quality of the performance. Furthermore, qualitative techniques, when used to evaluate large amounts of complex written material, convey the flavor of the data by identifying salient quotations to support conclusions. Qualitative data, once analyzed with a rubric, can be reduced to numbers by assigning a numerical value to particular types of responses. However, doing so reduces the information available to the researcher.
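The reduction of rubric-coded qualitative data to numbers described above can be sketched in code. The following is a minimal illustration, not the paper's actual coding scheme: the category labels echo the six rubric categories, but the 0–2 "shift" scale, the week cutoff, and the sample excerpts are all hypothetical assumptions introduced for the example.

```python
from collections import Counter

# Hypothetical coded excerpts: (student_id, week, rubric_category, score),
# where score 0-2 marks no shift / emerging shift / explicit metacognitive shift.
# These values are illustrative, not data from the study.
coded_excerpts = [
    ("s1", 3, "design_process", 1),
    ("s1", 14, "design_process", 2),
    ("s2", 3, "transfer_to_teaching", 0),
    ("s2", 14, "transfer_to_teaching", 2),
]

def category_counts(excerpts):
    """Count how many coded excerpts fall into each rubric category."""
    return Counter(cat for _, _, cat, _ in excerpts)

def mean_shift(excerpts, early_week=7):
    """Average score before vs. after a cutoff week, to show change over the term."""
    early = [s for _, w, _, s in excerpts if w <= early_week]
    late = [s for _, w, _, s in excerpts if w > early_week]
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return avg(early), avg(late)

counts = category_counts(coded_excerpts)
before, after = mean_shift(coded_excerpts)
print(counts["design_process"], before, after)
```

As the text notes, this quantization supports summary statistics but discards the salient quotations that make the qualitative record informative; in practice one would keep the coded text alongside the scores.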
Proceedings of the 2004 American Society for Engineering Education Annual Conference & Exposition Copyright © 2004, American Society for Engineering Education
Krause, S. (2004, June), Developing a Method to Measure the Metacognitive Effects of a Course on Design, Engineering and Technology over Time. Paper presented at 2004 Annual Conference, Salt Lake City, Utah. https://peer.asee.org/13118