Implementing a Single Holistic Rubric to Address Both Communication and Technical Criteria in a First Year Design-Build-Test-Communicate Class

Conference

2017 ASEE Annual Conference & Exposition

Location

Columbus, Ohio

Publication Date

June 24, 2017

Start Date

June 24, 2017

End Date

June 28, 2017

Conference Session

Writing and Communication

Tagged Division

Liberal Education/Engineering & Society

Page Count

18

DOI

10.18260/1-2--28479

Permanent URL

https://peer.asee.org/28479

Paper Authors

Stephanie Sheffield University of Michigan

Dr. Sheffield is a Lecturer in Technical Communication in the College of Engineering at the University of Michigan.

Robin Fowler University of Michigan

Robin Fowler is a lecturer in the Program in Technical Communication at the University of Michigan. She enjoys serving as a "communication coach" to students throughout the curriculum, and she's especially excited to work with first year and senior students, as well as engineering project teams, as they navigate the more open-ended communication decisions involved in describing the products of open-ended design scenarios.

Laura K. Alford University of Michigan

Laura K. Alford is a Lecturer and Research Investigator at the University of Michigan.

Katie Snyder University of Michigan

Dr. Snyder is a lecturer in the Program in Technical Communication at the University of Michigan. She teaches writing and presentation strategies to students in the College of Engineering.

Abstract

Rubrics are valuable tools for making learning goals and evaluation criteria explicit for students and instructors in engineering communication courses. This presentation focuses on the evolution of the rubrics used in a team-based, project-based engineering communication course in which students work in teams of 4 or 5 to design, build, and test underwater vehicles; grades for these design-build-test (DBT) projects are assigned jointly by technical faculty and communication faculty, based solely on the resulting oral and written design reports.

For the past 5 years, the project-based learning projects used two separate analytic rubrics that divided the available points into lists of concerns under technical and communication headings; the technical faculty evaluated projects using the technical rubric, and the communication faculty graded using the communication rubric. The system was straightforward, but the entire faculty team was troubled by the limitations of this approach to grading. Students seemed to treat the rubrics as a series of discrete checkboxes instead of considering how the different parts of the assignment contributed to the whole. The association of points with specific categories on the analytic rubric seemed to lead students to focus too intently on the number of points earned or lost in a given category, rather than on the pedagogical goals those categories were intended to represent. At times, the faculty felt restricted by the analytic rubric’s lack of flexibility to accommodate and value the different approaches required by the different design choices made in the open-ended DBT projects (e.g., some designs required a much more detailed discussion of stability, while others required more images or a greater focus on consistent performance, yet teams would fulfill the requirements of the rubric instead of making effective rhetorical decisions based on the needs of their project and audience). Most importantly, the analytic rubric did not effectively represent for students the way(s) in which design deliverables are evaluated in a non-academic context, so writing in response to those rubrics was not providing the “authentic,” real-world design project experience that is one of the goals of this introductory engineering course.

This year, the communication and technical instructors worked together to create a single holistic rubric to be used for all grading, one focused on the overall impression the student deliverables would make on a supervisor evaluating the work before passing it along to a client. The new rubric defines 5 specific bins, each associated with a grade percentage from 100% down to 73.5%, into which deliverables might fall, with each bin providing a description of the quality and condition of the work and the likely supervisor response to it. (The rubric indicates that scores lower than 73.5% are possible, but does not provide specific descriptions of the work associated with those lower scores, apart from a statement that such work would likely be largely incomplete.)

In making this change, we anticipate several benefits. The holistic rubric will encourage students to think rhetorically when documenting and describing their work: “What do we need to do to communicate OUR design well, and to persuade THIS audience of its appropriateness given THIS scenario?” Such an approach likely transfers more accurately to their eventual professional lives. The more flexible, holistic approach will allow us to value both the innovative design decisions made by our students and the rhetorical choices those students make in representing those decisions in their written and oral reports. Finally, the new rubric will enable us to more appropriately reward high-quality work overall: there are no tenths of points tied to specific items that may not apply equally well to all designs, and teams are no longer penalized when they excellently support particular design choices in ways that a detailed analytic rubric omits (and that are therefore difficult to reward).

In undertaking this revision to our grading approach, we recognized a potential downside in moving from an analytic rubric to one that evaluates based on the impression of the entire document: because holistic rubrics are by definition less detailed, we would risk losing some of the existing transparency about how grades are assigned, which might cause anxiety for our students, who are frequently, and understandably, more concerned with the immediate grade their work receives than with the long-term pedagogical goals of the course.

We attempted to alleviate this potential loss of transparency in two ways: 1) by continuing our practice of providing detailed written feedback on assignments (without tying that feedback to specific numbers of points gained or lost), and 2) by creating a brief but detailed document for each of the two major DBT projects that explains what excellent design reports on that project would include, while allowing for the flexibility that was lacking in the analytic rubric.

In this paper, we explore how instructional use of these scoring rubrics affects how student teams approach engineering communication assignments, and what influence the use of these artifacts has on students’ understanding of quality in engineering communication. We also report on instructor experiences using the two disparate rubric types, including how we were able to combine technical and communication goals and whether our use of these documents made grading easier or harder.

The previous and current rubrics will be attached in an appendix to the final paper.

Sheffield, S., Fowler, R., Alford, L. K., & Snyder, K. (2017, June). Implementing a Single Holistic Rubric to Address Both Communication and Technical Criteria in a First Year Design-Build-Test-Communicate Class. Paper presented at 2017 ASEE Annual Conference & Exposition, Columbus, Ohio. 10.18260/1-2--28479

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2017 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015