Session 2630
Comparing Design Team Self-Reports with Actual Performance: Cross-Validating Assessment Instruments
Robin Adams¹, Pimpida Punnakanta¹, Cynthia J. Atman¹,², Craig D. Lewis¹
¹ Center for Engineering Learning and Teaching, ² Department of Industrial Engineering, University of Washington
Abstract

Assessing student learning of the engineering design process is challenging. Students’ ability to answer test questions about the design process or record their design activities may differ significantly from their actual performance in solving “messy” open-ended problems. In the Pacific Northwest, multi-university participants in a National Science Foundation supported project (Transferable Integrated Design Engineering Education, TIDEE) have implemented and disseminated a Mid-Program Assessment instrument for assessing engineering student design competency. One part of the instrument requires student teams to document (i.e., self-report) their design decisions and processes while engaged in a design task. These written self-reports are scored using a rubric that has demonstrated high inter-rater reliability. We are interested in comparing the scores derived from these self-reports with measures of actual design performance. Our research method for analyzing design performance is verbal protocol analysis. In this study, eighteen teams of students (2-6 students per team) from four different institutions were videotaped as they completed the TIDEE Mid-Program Assessment. In this paper we provide 1) a description of the assessment instrument, 2) our research methods for assessing the validity of the instrument, 3) examples of comparing self-reports to performance, and 4) a summary of our findings. We conclude with a discussion of the strengths and weaknesses of this study, as well as implications for teaching and assessing engineering student design competency.
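As an illustration only, the sketch below shows one common way to quantify the inter-rater reliability of rubric scores, Cohen's kappa. The paper does not state which agreement statistic was used for the TIDEE rubric, and all scores in the example are hypothetical.

# Minimal sketch, assuming Cohen's kappa as the agreement statistic;
# the paper does not name the statistic, and the rubric scores below
# are invented for illustration only.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical rubric scores."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(counts_a) | set(counts_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical scores (1-4 rubric scale) from two independent raters
rater_1 = [3, 4, 2, 3, 3, 1, 4, 2, 3, 4]
rater_2 = [3, 4, 2, 2, 3, 1, 4, 2, 3, 3]
print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")  # about 0.72 for these scores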
Introduction
To compete in an increasingly global economy, the education of tomorrow’s engineers needs to emphasize competency in solving open-ended engineering design problems. This theme is evident in the growing collaboration among accrediting agencies, industry, and federal funding agencies to support research on the assessment of student learning and to encourage excellence in curricula and pedagogy that provide exposure to engineering practice 1-3. In addition, the implementation of the new ABET EC 2000 criteria 4 makes it necessary for engineering programs to identify, assess, and demonstrate evidence of design competency. These changes in accreditation have expanded the goal from assessing student learning outcomes to making judgments about curricula and instructional practices, with an aim toward continual improvement.
Assessing student learning of the engineering design process is particularly challenging, and efforts to assess design competency are varied 5-6. Examples of survey-based approaches include self-assessments of abilities and knowledge 7-8 and peer-based instruments in which students assess the competency of their peers 9-10. Examples of performance-based assessments include juries where
Adams, R., & Punnakanta, P., & Lewis, C. D., & Atman, C. (2002, June), Comparing Design Team Self Reports With Actual Performance: Cross Validating Assessment Instruments Paper presented at 2002 Annual Conference, Montreal, Canada. 10.18260/1-2--10043
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2002 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015