June 20, 2010
June 23, 2010
Educational Research and Methods
Refinement and Initial Testing of an Engineering Student Presentation Scoring System
We have previously created and beta tested a workforce-relevant, research-based scoring system for use with engineering student presentations across multiple contexts. Since then, we have systematically validated, refined, and tested the rubric in a five-step process described in detail in this paper. First, we tested the face validity and usability of the instrument by collecting additional feedback during focus groups and interviews with: faculty possessing expertise in scoring system design, faculty with experience in engineering design projects that involve student presentations, and additional faculty from a variety of backgrounds. Second, we used this feedback to reduce overlap and complexity in the scoring system items. Third, teaching assistants and the researchers used the scoring system items to provide feedback to approximately 140 students on presentations in a senior design course. Fourth, we made additional modifications and simplifications to the system based on insights gained from the TA feedback process. Fifth and finally, three raters applied the resulting scoring system to several videotaped student presentations to check for inter-rater reliability and evidence of construct validity. Based on the methodology above, we reduced the instrument from 36 items to 19 items. These items include using concrete examples and details familiar to the audience; consistently referring to how key points fit into the big picture; using graphics that are visually appealing, easy to understand, and helpfully labeled; and effectively combining energy, inflection, eye contact, and movement; among others. This paper includes a description of the process used to create the instrument, a description of the instrument, the supplemental teaching guidelines under development, and a discussion of the materials’ potential for use across many engineering contexts.
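The abstract mentions checking inter-rater reliability across three raters but does not name the statistic used. For illustration only, a common choice for agreement among three or more raters assigning categorical rubric scores is Fleiss' kappa; the sketch below (not the authors' actual analysis) shows how it could be computed from raw ratings:

```python
from collections import Counter

def fleiss_kappa(ratings):
    """Fleiss' kappa for categorical ratings.

    ratings: list of lists; ratings[i] holds each rater's category
    (e.g. a rubric score) for presentation item i. Every item must
    be rated by the same number of raters.
    """
    n = len(ratings[0])   # raters per item
    N = len(ratings)      # number of rated items
    categories = sorted({c for row in ratings for c in row})
    counts = [Counter(row) for row in ratings]  # n_ij per item

    # Mean per-item agreement P_bar
    P_bar = sum(
        (sum(c[j] ** 2 for j in categories) - n) / (n * (n - 1))
        for c in counts
    ) / N

    # Chance agreement P_e from marginal category proportions
    p = {j: sum(c[j] for c in counts) / (N * n) for j in categories}
    P_e = sum(v ** 2 for v in p.values())

    return (P_bar - P_e) / (1 - P_e)

# Hypothetical data: three raters scoring three presentations on a 1-2 scale.
scores = [[1, 1, 2], [2, 2, 2], [1, 1, 1]]
print(fleiss_kappa(scores))  # 0.55: substantial but imperfect agreement
```

Perfect agreement on every item yields kappa = 1, while kappa near 0 indicates agreement no better than chance; values above roughly 0.6 are conventionally read as substantial agreement.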
At Georgia Tech, with funding from the Engineering Information Foundation and approval from the Institutional Review Board for research with human subjects, we have created and beta tested a workforce-relevant, research-based scoring system for use with engineering student presentations across multiple contexts. The scoring system is designed to enhance students’ presentation skills so they can perform better in class, get a better job, and move quickly up the career ladder. In addition, the system addresses needs for outcomes assessment in communication skills for ABET, can improve the reliability and validity of scoring for engineering student presentations by faculty, and serves as a tool to help match instruction to the assessment and evaluation of student performances involving engineering communication.
In this paper we cover three aspects: the scoring system itself, its development, and its use. First, we describe the current version of the system along with examples from the supplemental teaching guidelines for professors and teaching assistants to use when instructing, assessing, and evaluating engineering student presentations in any university. Together, these tools provide the basis for presentation instruction, even by instructors who are not experts in
Utschig, T., & Norback, J. (2010, June). Refinement and Initial Testing of an Engineering Student Presentation Scoring System. Paper presented at the 2010 ASEE Annual Conference & Exposition, Louisville, Kentucky. doi: 10.18260/1-2--16824
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2010 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.