San Antonio, Texas
June 10–13, 2012
ISSN: 2153-5965
Advances in Assessment of Communication and Interdisciplinary Competence
Liberal Education/Engineering & Society
Pages: 25.1498.1 - 25.1498.15
DOI: 10.18260/1-2--22255
https://peer.asee.org/22255
Tristan Utschig is a Senior Academic Professional in the Center for the Enhancement of Teaching and Learning and Assistant Director for the Scholarship and Assessment of Teaching and Learning at the Georgia Institute of Technology. In this role, he consults with faculty about planning and assessing educational innovation in the classroom. He also serves as an evaluator on educational research grants. Formerly, he was a tenured Associate Professor of engineering physics at Lewis-Clark State College. Utschig has regularly published and presented work on a variety of topics, including assessment instruments and methodologies, using technology in the classroom, faculty development in instructional design, teaching diversity, and peer coaching. Utschig completed his Ph.D. in nuclear engineering at the University of Wisconsin, Madison, where he worked on safety issues for fusion reactor designs.
Judith Shaul Norback received her B.A. from Cornell magna cum laude and her master's and Ph.D. from Princeton. She has worked in the area of workplace communication skills for 25 years, starting at Educational Testing Service in 1987, then founding and directing the Center for Skills Enhancement, Inc., in 1993. Her clients included the National Skill Standards Board, the U.S. Department of Labor, and many universities. Norback joined Georgia Tech in 2000 to focus on the workplace communication skills of engineers and serves as general faculty and Director of Workplace and Academic Communication in the Stewart School of Industrial and Systems Engineering. In 2003, she founded the Workforce Communication Lab, which has had more than 16,000 student visits to date. The instruction she developed has been shown to make a significant difference in students' presentation skills during five semesters so far. Norback has published in refereed journals and conference proceedings, presented at national conferences, and is now Program Chair for her division in ASEE, VP of External Relations for INFORMS-ED, and Chair for Student Involvement for the 2012 Capstone Design Conference. She is working on a book called "Oral Communication Excellence for Engineers: What the Workforce Demands" for John Wiley & Sons (due in 2013) and several articles, while continuing to teach capstone design communication instruction and a course on journal article writing for graduate students. Her current research focus includes evaluating the reliability of the scoring rubric she and Tristan Utschig developed from executive input and identifying the cognitive schema used by students to create graphs from raw data.
Jeffrey S. Bryan is currently in his first year of Georgia Tech's M.S. program in digital media. He attended Southern Utah University as an undergraduate, where he majored in English education. He worked for several years as a trainer for AT&T, teaching adult learners, and as an editor for an opinion research company. He currently works as a Graduate Research Assistant in Georgia Tech's Center for the Enhancement of Teaching and Learning (CETL), where he assists with assessment and data analysis for ongoing CETL projects. His master's thesis involves an investigation of choice and transgression in video game storytelling.
Workforce Communication Instruction: Preliminary Inter-rater Reliability Data for an Executive-based Oral Communication Rubric

We have conducted a preliminary inter-rater reliability study for an executive-based scoring rubric used to rate oral presentations. Our goal is to answer the question: do different raters give the same feedback on the same presentation? We built the rubric based on input from more than 66 executives who hold engineering degrees and work in a variety of settings. For the past three years, our workplace presentation instruction has been based on the rubric. Reliability in our setting has been achieved through extensive in-person training, and our data indicate the rubric has been effective in improving oral presentation skills. To date, 155 schools have requested and received the rubric and supporting instructional materials for use in their own settings. As we continue to distribute the rubric, we need to analyze to what extent different raters of the same presentation provide similar feedback. Once this question is answered, we will modify the rubric and document the training needed to improve inter-rater reliability.

Based on this study, schools will be able to prioritize which pieces of the rubric to use in their courses. Teachers and communication professionals can make decisions based on the skills most important for their context and the most reliable parts of the rubric. Schools will also be able to use the results of this study to enhance reliability between different raters by identifying the most reliable parts of the rubric.

We have collected scores from raters in three different contexts: (1) the researchers and teaching assistants rating videotaped presentations from capstone design, (2) a group of ASEE workshop attendees rating different videotaped presentations, and (3) students rating 20 peer presentations in a class of 80 students.
We analyzed the data collected from the raters using the following procedure: first, we conducted pairwise comparisons among raters of the same presentations for frequency of exact matches; second, we conducted pairwise comparisons for frequency of 1-point score consistency (on a five-point scale); and third, we calculated Pearson correlation coefficients for rater consistency as compared to the "true" score, where the true score is defined by the scores given either by the Director of Workplace and Academic Communication, who has thirty years of experience in this field, or by the course instructor, depending upon the setting.

The inter-rater reliability results identified in this study were used in conjunction with both formal and informal user feedback to modify the existing rubric. These modifications have two aims: to enhance ease of use and understanding of the rubric, and to enhance reliability among different raters using it. With these modifications completed, other schools using the modified rubric will be able to enhance the reliability among their scorers and apply the rubric with even greater confidence that the feedback their students receive will improve their performance in oral presentations. Finally, we will make available the modified rubric and an outline of the training needed to support reliable scoring among raters.
Utschig, T. T., Norback, J. S., & Bryan, J. S. (2012, June). Workforce Communication Instruction: Preliminary Inter-rater Reliability Data for an Executive-based Oral Communication Rubric. Paper presented at the 2012 ASEE Annual Conference & Exposition, San Antonio, Texas. DOI: 10.18260/1-2--22255
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2012 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015