June 23, 2013
June 26, 2013
Liberal Education/Engineering & Society
23.763.1 - 23.763.17
Achieving Broad Usability for a Suite of Tools to Evaluate Workforce Presentation Instruction for Engineers

Over the past decade and more, many engineering schools have worked to implement effective oral presentation instruction, yet engineering students' lack of oral presentation skills persists. During the past three years at [affiliation removed for blind review], we have built, tested, and implemented a set of tools that significantly improves student presentation skills. The set of tools is based on empirical research (in particular, input from executives with engineering degrees) and has been requested by 180 universities to date. It includes a scoring system listing 19 skills, a teachers' guide, and a description of the "wow" performance for each skill. We learned from a recent study that some skills have higher inter-rater reliability than others; in other words, some skills are rated more consistently than others across different raters. For example, when different faculty or TAs see the same presentation and rate a particular skill the same, we have high inter-rater reliability on that skill.

In this paper we describe our current work to improve the inter-rater reliability of certain skills. We have used a modified Delphi method to improve the inter-rater reliability and are in the process of implementing the revised suite of tools in three very different engineering departments (Industrial, Biomedical, and Aerospace) at [removed for blind review]. The Delphi method is a process of structured communication designed to help a group of individuals reach consensus on a complex issue; it usually employs several consecutive rounds of feedback. We used two rounds of feedback from the various stakeholders: faculty, students, teaching assistants, and executives. In each round, individuals first provided comments on the individual skills in the scoring system.
Then we summarized the feedback from all the individuals and asked them to reflect on the summary to see whether their opinions had changed. At the end of each of the two rounds, we produced a written document synthesizing the overall responses, and we modified the scoring system according to the feedback before moving ahead.

The resulting scoring system includes modifications to about half of the skills. For example, the skill of appropriate graphics has been combined with the skill of engaging graphics because stakeholders were unable to distinguish between the two. And the training for the skill of personal presence (which includes energy, inflection, eye contact, and movement) was changed to recommend different ratings for specific combinations of performance on the four components of the skill.

At ASEE we will report data demonstrating the new level of reliability, indicating the broad usability of the suite of tools.
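The inter-rater reliability discussed above can be quantified in several ways; a common choice for two raters scoring the same presentations is Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. The sketch below is purely illustrative — the paper does not state which statistic the authors used, and the ratings shown are hypothetical, not data from the study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items.

    kappa = (p_observed - p_expected) / (1 - p_expected), where
    p_expected is chance agreement from each rater's marginal counts.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Proportion of items on which the two raters gave the same score
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's score frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    if p_expected == 1.0:
        return 1.0
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical 1-5 ratings of ten presentations on a single skill
a = [4, 3, 5, 2, 4, 4, 3, 5, 2, 4]
b = [4, 3, 4, 2, 4, 3, 3, 5, 2, 4]
print(round(cohens_kappa(a, b), 3))  # → 0.722
```

A per-skill kappa computed this way would make visible which of the 19 skills are rated consistently and which need clearer descriptors, which is the kind of diagnosis the Delphi revision responds to.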
Utschig, T. T., & Bryan, J. S., & Norback, J. S. (2013, June), Insights into the Process of Building a Presentation Scoring System for Engineers Paper presented at 2013 ASEE Annual Conference & Exposition, Atlanta, Georgia. https://peer.asee.org/19777
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2013 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015