
Measuring Adaptive Expertise in Engineering Education



2016 ASEE Annual Conference & Exposition


New Orleans, Louisiana

Publication Date

June 26, 2016

Start Date

June 26, 2016

End Date

June 29, 2016





Conference Session

Student Evaluation in Design Education

Tagged Division

Design in Engineering Education


Paper Authors


Olga Pierrakos James Madison University


Olga Pierrakos is a founding faculty member and Associate Professor in the Department of Engineering at James Madison University. She is currently a Program Director at the National Science Foundation in the Division of Undergraduate Education. Her expertise and interests focus on diversity and inclusion, engineering identity, problem-based learning (PBL), innovative learning-centered pedagogies, assessment of student learning, engineering design, and capstone design. She also conducts research in cardiovascular fluid mechanics and sustainable energy technologies. She holds a BS and MS in Engineering Mechanics and a PhD in Biomedical Engineering from Virginia Tech.



Robin Dawn Anderson James Madison University


Robin D. Anderson serves as the Academic Unit Head for the Department of Graduate Psychology at James Madison University. She holds a doctorate in Assessment and Measurement. She previously served as the Associate Director of the Center for Assessment and Research Studies at JMU. Her areas of research include assessment practice and engineering education research.



Cheryl Alyssa Welch


Alyssa Welch is a Psychological Sciences master’s student in the concentration of Experimental Psychology, and a Graduate Teaching Assistant in the Department of Psychology at James Madison University. She received her BA in Psychology and Anthropology from James Madison University as well. As a graduate assistant, she is currently working closely with the engineering department performing interdisciplinary research with the goal of improving engineering programs at the undergraduate level. Her research interests include adaptive expertise, metacognition, social influence, cognitive dissonance, and social neuroscience.




Amid global competition, outsourcing, and the growing number of engineers educated overseas, undergraduate engineering programs in the United States face increasing pressure to prepare students for an evolving workforce. Given the rapid pace of technological change, future engineers will need to adapt quickly and engage in novel problem solving. Current measures of student learning, such as standardized test scores and GPA, may not predict academic and career success as well as once believed, because they do not capture students' adaptive problem-solving abilities. Recent studies have attempted to define an adaptive expertise skill set that, when assessed, would give engineering programs a better predictor of student success. Adaptive expertise is defined as the ability to apply knowledge, gained through prior experiences, to novel situations in which key information is missing. Researchers have attempted to measure adaptive expertise through a variety of methods, including interviews, think-alouds, and beliefs surveys. One of the most commonly used measures is Fisher and Peterson's Adaptive Expertise Beliefs survey (2001).

As part of a larger post-semester survey, researchers at a mid-Atlantic university administered Fisher and Peterson's Adaptive Expertise Beliefs survey (2001) to students enrolled in a senior capstone design course. Two sections of the course were included in data collection: instructors taught one section using methods grounded in the principles of adaptive expertise, while the other section received traditional lecture-based instruction. Results indicated a significant difference between the sections in overall adaptive expertise belief scores. However, researchers found no significant differences between the two groups on any of the individual Fisher and Peterson subscales, making interpretations regarding the impact of instructional methodology more difficult and less useful to the instructors.

This paper argues for the development of a more sensitive, more direct measure of adaptive expertise. Given the importance of adaptive expertise to the emerging engineer, it is imperative that such a measure provide meaningful information about how students perform on the various dimensions of the construct. Moreover, Fisher and Peterson's survey is an indirect measure: it assesses what students report, not what they can actually do or demonstrate. A direct measure of adaptive expertise skills would allow for the development of educational interventions that promote adaptive expertise skills (not merely beliefs). In particular, such interventions would foster more complex problem solving, better preparing future engineers for the fast-paced and constantly evolving environment of the workplace.

Pierrakos, O., & Anderson, R. D., & Welch, C. A. (2016, June), Measuring Adaptive Expertise in Engineering Education Paper presented at 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. 10.18260/p.25690

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2016 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015