Insights into the Process of Building a Presentation Scoring System for Engineers

Conference

2013 ASEE Annual Conference & Exposition

Location

Atlanta, Georgia

Publication Date

June 23, 2013

Start Date

June 23, 2013

End Date

June 26, 2013

ISSN

2153-5965

Conference Session

Communication and Engineering Careers: Motivating Our Students

Tagged Division

Liberal Education/Engineering & Society

Page Count

17

Page Numbers

23.763.1 - 23.763.17

Permanent URL

https://peer.asee.org/19777

Download Count

55

Paper Authors

Tristan T. Utschig Georgia Institute of Technology

Dr. Tristan T. Utschig is a Senior Academic Professional in the Center for the Enhancement of Teaching and Learning and is Assistant Director for the Scholarship and Assessment of Teaching and Learning at the Georgia Institute of Technology. Formerly, he was a tenured Associate Professor of Engineering Physics at Lewis-Clark State College. Dr. Utschig has regularly published and presented work on a variety of topics including assessment instruments and methodologies, using technology in the classroom, faculty development in instructional design, teaching diversity, and peer coaching. Dr. Utschig completed his PhD in Nuclear Engineering at the University of Wisconsin–Madison.

Jeffrey S. Bryan

Jeffrey S. Bryan is currently in his second-year of Georgia Tech's M.S. program in digital media. He attended Southern Utah University as an undergraduate, and majored in English education. He worked for several years as a trainer for AT&T, teaching adult learners, and as an editor for an opinion research company. He currently works as a Graduate Research Assistant in Georgia Tech's Center for the Enhancement of Teaching and Learning (CETL), where he assists with assessment and data analysis for ongoing CETL projects. His master's thesis is an analysis of choice and player narratives in video game storytelling.

Judith Shaul Norback Georgia Institute of Technology

Dr. Judith Shaul Norback, Ph.D. is faculty and the Director of Workplace and Academic Communication in the Stewart School of Industrial and Systems Engineering at Georgia Institute of Technology. She has developed and provided instruction for students in industrial engineering and biomedical engineering and has advised on oral communication instruction at many other universities. The Workplace Communication Lab she founded in 2003 has had over 19,000 student visits. As of Spring 2013, she has shared her instructional materials with over 200 schools from the US, Australia, Germany, and South Korea. Dr. Norback has studied communication and other basic skills in the workplace and developed curriculum over the past 30 years—first at Educational Testing Service, then as part of the Center for Skills Enhancement, Inc., which she founded, and, since 2000, at Georgia Tech. She has published over 20 articles in the past decade alone, including articles in IEEE Transactions on Professional Communication, INFORMS Transactions on Education, and the International Journal of Engineering Education. Over the past ten years Norback has given over 40 presentations and workshops at nation-wide conferences such as the American Society for Engineering Education (ASEE), where she currently serves as chair of her division. Dr. Norback also holds an office for the Education Forum of INFORMS and has served as Associate Chair for the Capstone Design Conference. Much of her work in the past five years has been conducted with Tristan Utschig, Associate Director of Assessment at Georgia Tech’s Center for Teaching and Learning. Dr. Norback’s education includes a bachelor’s degree from Cornell University and master's and Ph.D. degrees from Princeton University. Her current research interests include increasing the reliability of the Norback & Utschig Presentation Scoring System for Engineers and Scientists and the cognitive constructs students use when creating a graph from raw data.

Abstract

Achieving Broad Usability for a Suite of Tools to Evaluate Workforce Presentation Instruction for Engineers

Over the past decade and more, many engineering schools have been working to implement effective oral presentation in their instruction. But the problem of engineering students' lack of oral presentation skills persists. During the past three years at [affiliation removed for blind review], we have built, tested, and implemented a set of tools that we know improves student presentation skills at a significant level. The set of tools is based on empirical research (in particular, input from executives with engineering degrees) and has been requested by 180 universities to date. The set of tools includes a scoring system listing 19 skills, a teachers' guide, and a description of the "wow" performance for each skill. We learned from a recent study that some skills have different levels of inter-rater reliability than others. In other words, some skills are rated more consistently than others among different raters. For example, when different faculty or TAs see the same presentation and rate a particular skill the same, we have high inter-rater reliability on that skill.

In this paper we describe our current work to improve the inter-rater reliability of certain skills. We have used a modified Delphi method to improve the inter-rater reliability and are in the process of implementing the revised suite of tools in three very different engineering departments (Industrial, Biomedical, and Aerospace) at [removed for blind review]. The Delphi method is a process of structured communication designed to aid a group of individuals in coming to consensus about a complex issue. This method usually employs several consecutive rounds of feedback. We have used two rounds of feedback from the various stakeholders: faculty, students, teaching assistants, and executives. In each round of feedback, individuals first provided comments on the individual skills in the scoring system. Then we summarized the feedback from all the individuals and asked the individuals to reflect upon the summary to see if their opinions had changed. At the end of the feedback for each of the two rounds, we produced a written document synthesizing the overall responses of all the individuals. In each round, we modified the scoring system according to the feedback before moving ahead.

The resulting scoring system includes modifications for about half of the skills. For example, the skill of appropriate graphics has been combined with the skill of engaging graphics because stakeholders were unable to distinguish between the two skills. And the training for the skill of personal presence (which includes energy, inflection, eye contact, and movement) was changed to recommend different ratings for specific combinations of performance on the four components of the skill.

At ASEE we will be able to report data demonstrating the new level of reliability, indicating the broad usability of the suite of tools.

Utschig, T. T., & Bryan, J. S., & Norback, J. S. (2013, June), Insights into the Process of Building a Presentation Scoring System for Engineers Paper presented at 2013 ASEE Annual Conference & Exposition, Atlanta, Georgia. https://peer.asee.org/19777

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2013 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015