Location: Seattle, Washington
Publication Date: June 14, 2015
Conference Dates: June 14-17, 2015
ISBN: 978-0-692-50180-1
ISSN: 2153-5965
Tagged Division: Design in Engineering Education
Page Count: 23
Page Numbers: 26.692.1 - 26.692.23
DOI: 10.18260/p.24029
Permanent URL: https://peer.asee.org/24029
Download Count: 793
Joy Adams is the Program Manager for the Multidisciplinary Design Program at the University of Michigan. In this role, she focuses on Corporate Sponsored Projects, Communications, and Student Performance Appraisals. She has seven years of diverse professional Human Resources experience, including prior roles in Training & Development, Campus Recruiting, and Talent Management/Leadership Development at various Fortune 500 firms.
Mical DeGraaff is a recent graduate of the University of Michigan, where she received her Master of Arts in Higher Education Administration. As a student, Mical worked as a Research Assistant for the University of Michigan College of Engineering's Multidisciplinary Design Program. Her research interests include student involvement and engaged learning.
Gail Hohner is the Managing Director of the Multidisciplinary Design Program in the College of Engineering at the University of Michigan, Ann Arbor, where she develops multidisciplinary engaged learning experiences in the engineering design process. She teaches the seminar in Leadership/Mentorship in Multidisciplinary Engineering Design, and her research focuses on improvements in the pedagogy of engineering design process instruction. She is the 2016 program chair of the DEED division of ASEE. She has 17 years of industrial experience and holds a B.S.E. in Chemical Engineering and an M.S. in Food Science/Chemical Engineering from Cornell University.
Evaluating the Pre-Professional Engineer: Project Team and Individual Performance

At a large midwestern research university, teams of 5-7 engineers from multiple disciplines work together on multi-term engineering design projects. Annually, the program enrolls approximately 220 students on engineering design project teams. As we prepare to enter the fifth year of this program, we are studying the effectiveness of scalable administration of effective peer review feedback. Currently, we collect, anonymize, and disseminate performance feedback utilizing the CATME team peer evaluation tool (Loughry, Ohland, & Moore, 2007) twice during the tenure of each team’s project. We observed that our students were unable to sufficiently interpret the feedback data, and their reactions to the feedback from their peers ranged from complete dismissal to an inability to integrate the feedback into their performance. As a result, this has not generated our desired outcomes: improved project-specific engineering design skills, professional behavior, or evidence of self-reflection. To address this, we have created and implemented an additional avenue for peer-to-peer anonymized feedback: a qualitative survey utilizing coded competencies drawn from various industry-based best practices. In this supplemental qualitative survey, students select two performance competencies (one positive and one developmental) from a predefined list for each of their peers. Students then support the selected competencies by citing specific behavioral examples.

We coded these statements into three skill constructs (professionalism, teamwork, and core skills) using a unique combination of structural, attribute, provisional, and holistic qualitative coding techniques. The coded data is analyzed against semi-anonymized demographic data in order to determine how various identities affect the ways in which students evaluate and are evaluated by their peers. Utilizing action research methodology, we made relevant changes to the surveys prior to the second round of evaluations to improve students’ understanding of the survey questions.

Student growth is measured via peers’ quantitative (CATME) and qualitative evaluations of engineering design skills and professional behavior, as well as the students’ own ability to craft constructive feedback statements. Because specific instances of behavior are cited in the qualitative survey, students are better equipped to leverage their strengths and address shortcomings. We identify trends in peer evaluations with regard to self-identified gender, residency status, academic year, and other categories. The research team documents the ways in which students evaluate one another in an experiential team setting, the roles that certain academic and social identities play in that evaluation process, and how students’ evaluation abilities change over time.
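As a rough illustration of the coding and cross-tabulation step described above, the Python sketch below maps each peer-feedback entry’s selected competency onto one of the three skill constructs and tallies the results by a demographic attribute of the evaluated student. The competency list, construct mapping, and field names are hypothetical assumptions made for illustration; they are not the instrument or analysis code used in the study.

```python
# A minimal, hypothetical sketch of the coding/tally step described in the
# abstract: each peer-feedback entry names a competency, which is mapped to
# one of the three skill constructs (professionalism, teamwork, core skills)
# and cross-tabulated against a demographic attribute of the rated student.
# The competency list, construct mapping, and field names are illustrative
# assumptions, not the survey instrument used in the study.
from collections import Counter
from dataclasses import dataclass

# Hypothetical mapping from survey competencies to the paper's constructs.
CONSTRUCT_OF = {
    "meets deadlines": "professionalism",
    "communicates clearly": "teamwork",
    "resolves conflict": "teamwork",
    "CAD proficiency": "core skills",
    "analysis rigor": "core skills",
}

@dataclass
class FeedbackEntry:
    rated_student: str   # anonymized ID of the student being evaluated
    competency: str      # competency selected from the predefined list
    is_positive: bool    # True = strength, False = developmental
    example: str         # the supporting behavioral example (free text)

def tally_by_demographic(entries, demographics):
    """Count construct mentions per demographic group of the rated student.

    `demographics` maps anonymized student IDs to a single category label
    (e.g., self-identified gender or residency status).
    """
    counts = Counter()
    for e in entries:
        construct = CONSTRUCT_OF.get(e.competency, "uncoded")
        group = demographics.get(e.rated_student, "unknown")
        kind = "positive" if e.is_positive else "developmental"
        counts[(group, construct, kind)] += 1
    return counts

# Example usage with toy data:
entries = [
    FeedbackEntry("S1", "meets deadlines", True, "Delivered the test rig early."),
    FeedbackEntry("S2", "communicates clearly", False, "Status updates were sparse."),
]
demographics = {"S1": "domestic", "S2": "international"}
for key, n in tally_by_demographic(entries, demographics).items():
    print(key, n)
```

Keeping the free-text behavioral example alongside the coded competency mirrors the survey’s requirement that students cite specific behaviors, so the qualitative evidence remains attached to each tallied data point.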
Adams, J. M., DeGraaff, M. D., & Hohner, G. S. (2015, June). Evaluating the Pre-Professional Engineer: Exploring the Peer Review Process. Paper presented at 2015 ASEE Annual Conference & Exposition, Seattle, Washington. 10.18260/p.24029
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015