Conference Location: New Orleans, Louisiana
Publication Date: June 26, 2016
Conference Dates: June 26–29, 2016
ISBN: 978-0-692-68565-5
ISSN: 2153-5965
Conference Session: Research Methods II: Meeting the Challenges of Engineering Education Research
Tagged Division: Educational Research and Methods
Page Count: 15
DOI: 10.18260/p.26851
Permanent URL: https://peer.asee.org/26851
Download Count: 784
Dr. Canney teaches civil engineering at Seattle University. His research focuses on engineering education, specifically the development of social responsibility in engineering students. Other areas of interest include ethics, service learning, and the role of the public in engineering decisions. Dr. Canney received bachelor's degrees in Civil Engineering and Mathematics from Seattle University, a master's degree in Civil Engineering from Stanford University with an emphasis on structural engineering, and a PhD in Civil Engineering from the University of Colorado Boulder.
Angela Bielefeldt is a professor at the University of Colorado Boulder in the Department of Civil, Environmental, and Architectural Engineering (CEAE). She serves as the ABET assessment coordinator for the department. Professor Bielefeldt is the faculty director of the Sustainable By Design Residential Academic Program, a living-learning community where interdisciplinary students learn about and practice sustainability. She is also a licensed P.E. Her research interests in engineering education include service-learning, sustainable engineering, social responsibility, ethics, and diversity.
Greg Rulifson is a Civil Engineering doctoral candidate focused on qualitative engineering education research while also completing the Engineering in Developing Communities certificate. Greg earned his bachelor's degree in Civil Engineering with a minor in Global Poverty and Practice from UC Berkeley, where he acquired a passion for using engineering to facilitate developing communities' capacity for success. He earned his master's degree in Structural Engineering and Risk Analysis from Stanford University. His upcoming dissertation will focus on how students' connections between social responsibility and engineering change throughout college and on the social responsibility-related reasons some students choose to leave engineering.
This research paper explores the use of interviews as validity evidence for a survey instrument, the Engineering Professional Responsibility Assessment (EPRA). The EPRA tool uses 50 Likert items to assess engineering students’ attitudes toward personal and professional social responsibility. Validity evidence for EPRA based on internal structure has been previously examined using structural equation modeling and multidimensional item response theory; both showed strong evidence. This paper expands the body of validity evidence, specifically evidence based on relations to other variables: interview responses.
Data came from interviews with 24 engineering students conducted after they had completed the EPRA survey. To compare interview data to the Likert items, a coding rubric aligned with the Likert scores was developed with feedback from engineering education experts. Once the language of the rubric was solidified, two researchers coded each interview, producing a score in each dimension for each participant. Interview and survey scores were compared using Spearman's rank order correlation coefficient and the Wilcoxon signed-rank test. Results showed that four of the 24 respondents had significant correlation (p<0.05) and two had suggestive correlation (p<0.10) between their interview and EPRA scores across all dimensions. Eighteen respondents showed no significant difference between interview and survey scores (p>0.05). Across the eight dimensions, three showed strong correlation (p<0.05) and three showed no significant difference (p>0.05); only one dimension showed both significant correlation and no significant difference. The prospect of using interview data as validity evidence for a survey instrument is appealing: surveys tend to compress complex issues into discrete categories, perhaps oversimplifying the nuances of attitudes and beliefs. This exploration steps through one way this may be done, by coding interviews with a rubric and comparing the resulting scores with survey results. Suggestions for producing stronger results in future studies, such as more targeted interviews, are given.
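For readers unfamiliar with the two statistics named in the abstract, the short Python sketch below shows how one respondent's rubric-coded interview scores could be compared with their EPRA survey scores across the instrument's dimensions using SciPy. This is not the authors' analysis code; the per-dimension score values and their format are hypothetical placeholders.

# Minimal sketch (not the authors' code) of the comparison described above:
# one respondent's rubric-coded interview scores versus their EPRA survey
# scores across the instrument's dimensions, compared with Spearman's rank
# order correlation and the Wilcoxon signed-rank test. Scores are hypothetical.
from scipy import stats

interview_scores = [4.0, 3.5, 4.5, 3.0, 4.0, 3.5, 5.0, 4.0]  # rubric-coded interview, one per dimension
survey_scores = [4.2, 3.4, 4.6, 3.1, 3.8, 3.9, 4.8, 4.1]     # EPRA Likert-based scores, one per dimension

# Monotonic agreement between interview and survey scores
rho, rho_p = stats.spearmanr(interview_scores, survey_scores)

# Paired test for a systematic difference between the two score sets
w_stat, w_p = stats.wilcoxon(interview_scores, survey_scores)

print(f"Spearman rho = {rho:.2f} (p = {rho_p:.3f})")
print(f"Wilcoxon W = {w_stat:.1f} (p = {w_p:.3f})")

A respondent whose interview and survey scores track each other would show a high Spearman correlation, while the Wilcoxon test would indicate whether one data source systematically scored higher than the other.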
Canney, N. E., & Bielefeldt, A. R., & Rulifson, G. (2016, June), Exploring Interviews as Validity Evidence for the Engineering Professional Responsibility Assessment Paper presented at 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. 10.18260/p.26851
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2016 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.