Salt Lake City, Utah
June 23, 2018
July 27, 2018
Educational Research and Methods
Developing adaptive expertise in engineering students is of increasing importance as we strive to educate students who can respond to changes in the field by flexibly applying their theoretical knowledge and prior experience to new situations [Peng et al., 2014; Cantor et al., 2015; Bruenger, 2015]. To understand the extent to which different curricular and educational practices meet this aim, a means of measuring this complex skillset is necessary. In this paper, we introduce a revised Adaptive Expertise survey, building on Fisher and Peterson's (2001) 42-item scale. Our 20-item instrument reflects advances in adaptive expertise research from both the learning sciences and human resources literatures and has been developed through cognitive interviews, pilot testing, and exploratory factor analysis. Our revision of Fisher and Peterson's (2001) scale followed our use of that instrument on a sample of 4,704 undergraduate students as a pre- and post-test measure to understand the effect of reflective writing exercises on their development of adaptive expertise. Cronbach's alpha analysis produced reliability scores for the test's sub-scales in the poor-to-acceptable range (multiple perspectives, 0.68; metacognition, 0.68; goals and beliefs, 0.73; epistemology, 0.50) [Kline, 2000; Tavakol & Dennick, 2011]. A factor analysis failed to replicate the factor structure outlined by Fisher and Peterson. Drawing on current theorizing and research on adaptive expertise, we designed a revised scale with items addressing innovative skills, domain skills, metacognition [Bohle Carbonell et al., 2016], self-efficacy, and resilience. The revised scale included modified items from Fisher and Peterson's (2001) survey, as well as additional items of our own construction, and several items based on work by van der Heijden (2000), Charbonnier-Voirin et al. (2012), and Bohle Carbonell et al. (2016).
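The reliability figures above follow the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the summed score). A minimal sketch of that computation is below; the function name and the simulated response matrix are illustrative only, not part of the study's analysis code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

For example, a sub-scale whose items are perfectly correlated yields an alpha of 1.0, while uncorrelated items drive alpha toward 0, which is why values near 0.5 (as for the epistemology sub-scale) signal poor internal consistency.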
The first step in refining the scale was to conduct nine cognitive interviews with undergraduate students who were currently on co-operative education placements. These interviews were used to clarify the wording of scale items, reduce duplication, and remove or refine items that students interpreted ambiguously, resulting in a 39-item scale. We piloted that revised scale with 347 undergraduate students who were in co-operative education placements. Following an exploratory factor analysis that produced a four-factor model with strong face validity and significant goodness-of-fit for a 20-item scale, we introduced the survey into a longitudinal research project on co-operative education. We are currently using this survey to collect pre- and post-test responses from 1,009 undergraduate students across all majors who are in co-operative education placements. Once these data have been collected, we will be able to provide additional insights into the reliability and validity of this new scale through further factor analysis, including a confirmatory factor analysis. We also plan to present an analysis of how a student's year of study, engineering versus non-engineering major, number of co-operative education placements, and gender may affect survey responses on this revised scale.
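The exploratory factor analysis step described above can be sketched in simplified form: extract unrotated loadings from the eigendecomposition of the item correlation matrix and keep the top factors. This is principal-component extraction, a simplification of the full EFA procedure (which typically adds rotation and a communality-based extraction method); all names and the simulated data are illustrative, not the study's actual pipeline.

```python
import numpy as np

def efa_loadings(responses: np.ndarray, n_factors: int) -> np.ndarray:
    """Unrotated loadings for an (n_respondents, n_items) matrix.

    Eigendecompose the item correlation matrix and scale the top
    eigenvectors by the square root of their eigenvalues.
    """
    corr = np.corrcoef(responses, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)          # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_factors]    # take the largest n_factors
    return eigvecs[:, order] * np.sqrt(eigvals[order])
```

Items that load strongly (|loading| above roughly 0.4) on the same factor are then inspected for a common interpretation, which is how the four-factor, 20-item structure would be read off the loading matrix.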
Ferguson, J. H., Lehmann, J., Zastavker, Y. V., Chang, S., Higginson, R. P., & Talgar, C. P. (2018, June). Adaptive Expertise: The Development of a Measurement Instrument. Paper presented at 2018 ASEE Annual Conference & Exposition, Salt Lake City, Utah. https://peer.asee.org/29752
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2018 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015