Development and Use of a Client Interaction Rubric for Formative Assessment

Conference

2017 ASEE Annual Conference & Exposition

Location

Columbus, Ohio

Publication Date

June 24, 2017

Start Date

June 24, 2017

End Date

June 28, 2017

Conference Session

Student Feedback and Assessment in Design

Tagged Division

Design in Engineering Education

Page Count

13

DOI

10.18260/1-2--28157

Permanent URL

https://peer.asee.org/28157

Download Count

1180

Paper Authors

John K. Estell, Ohio Northern University

Dr. John K. Estell is Professor of Computer Engineering and Computer Science at Ohio Northern University, providing instruction primarily in the areas of introductory computer programming and first-year engineering. He has been on the faculty of the Electrical & Computer Engineering and Computer Science Department since 2001, and served as department chair from 2001 to 2010. He received a B.S.C.S.E. degree from The University of Toledo and the M.S. and Ph.D. degrees in Computer Science from the University of Illinois at Urbana-Champaign. Dr. Estell is a Fellow of ASEE, a Senior Member of IEEE, and a member of ACM, Tau Beta Pi, Eta Kappa Nu, Phi Kappa Phi, and Upsilon Pi Epsilon.

Dr. Estell is active in the assessment community with his work in streamlining and standardizing the outcomes assessment process, and has been an invited presenter at the ABET Symposium. He is also active within the engineering education community, having served ASEE as an officer in the Computers in Education and First-Year Programs Divisions; he and his co-authors have received multiple Best Paper awards at the ASEE Annual Conference. His current research includes examining the nature of constraints in engineering design and providing service learning opportunities for first-year programming students through various K-12 educational activities. Dr. Estell is a Member-at-Large of the Executive Committee for the Computing Accreditation Commission of ABET, and also serves as a program evaluator for the Engineering Accreditation Commission. He is also a founding member and serves as Vice President of The Pledge of the Computing Professional, an organization dedicated to the promotion of ethics in the computing professions through a standardized rite-of-passage ceremony.

Susannah Howe, Smith College

Susannah Howe, Ph.D., is the Design Clinic Director in the Picker Engineering Program at Smith College, where she coordinates and teaches the capstone engineering design course. Her current research focuses on innovations in engineering design education, particularly at the capstone level. She is invested in building the capstone design community; she is a leader in the biennial Capstone Design Conferences and the Capstone Design Hub initiative. She is also involved with efforts to foster design learning in middle and high school students and to support entrepreneurship at primarily undergraduate institutions. Her background is in civil engineering with a focus on structural materials. She holds a B.S.E. degree from Princeton, and M.Eng. and Ph.D. degrees from Cornell.

Abstract

This abstract describes initial efforts to develop a rubric in support of student-client interactions for client-oriented project-based learning activities. The rubric has been tested in two environments: (1) a user interface design course at a small private Midwest university taken by both computer engineering and computer science majors, and (2) an engineering capstone design course at a small private Northeast liberal arts college. The goal of this research is to develop and disseminate a versatile rubric that can be used for formative assessment in a variety of settings involving student-client interactions.

There is a growing movement within the engineering education community toward incorporating real-world design experiences into the curriculum, where teams of students work with or for a client to solve a problem. In these circumstances, clients are generally aware that they are working with students rather than professionals, and so are more willing to provide formative feedback critiquing student efforts. One straightforward way to provide such feedback is through rubrics; unfortunately, a literature search failed to turn up any rubrics designed specifically for student-client interactions. Accordingly, the development of a “Client Interaction Rubric” as discussed here fills this identified need while serving two purposes: obtaining formative feedback from clients to help improve students’ client interaction skills, and providing students in advance with a framework of key criteria for successful client interactions.

Recent research has shown that the single-point rubric, a variant of the analytical rubric, can be particularly effective in providing formative feedback. Its three-column tabular format provides a single set of criteria-based performance standards for meeting expectations (middle column), plus space for qualitative responses on performance that is either above or below these expectations (left and right columns, respectively). The single-point rubric approach was adopted for the Client Interaction Rubric because of its simplicity and clarity. The criteria and corresponding performance standards were based on a set of outcomes associated with the entrepreneurial mindset. Subsequent refinement of the rubric was informed by conversations with an industrial advisory board, faculty colleagues, students, and clients.
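
For illustration, a single criterion row in a single-point rubric might be laid out as in the sketch below; the criterion wording here is a hypothetical example, not taken from the authors' actual rubric:

  Above Expectations       | Meets Expectations                          | Below Expectations
  (space for qualitative   | Criterion: Meeting preparation. Students    | (space for qualitative
  comments)                | arrive with an agenda, engage the client    | comments)
                           | professionally, and follow up on action     |
                           | items from prior meetings.                  |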

The initial pilot testing of the rubric occurred in Fall 2016. In the user interface design course, students met monthly with their external client. Following these meetings, the client met with the instructor to review the effectiveness of the rubric and to complete it based on the observed student performance. The assessed performance results were then shared with the students, who wrote individual responses reflecting on the feedback provided. In the engineering capstone design course, students held kick-off meetings with liaisons from their sponsoring organizations. At the next meeting with their faculty advisor, the rubric was used as a guiding template for informally debriefing the kick-off meetings and for planning strategies to improve student performance in future meetings. The rubric also served as a tool for debriefing later liaison meetings in similar fashion.

The most current version of the Client Interaction Rubric will be made available for review and comment by the greater design community; early adopters will be given the means to download a copy of the rubric. Next steps for this research include developing a set of recommended settings and situations for rubric use, adapting the rubric for different client environments, and developing a methodology for assessing the rubric's efficacy.

Estell, J. K., & Howe, S. (2017, June). Development and Use of a Client Interaction Rubric for Formative Assessment. Paper presented at the 2017 ASEE Annual Conference & Exposition, Columbus, Ohio. 10.18260/1-2--28157

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2017 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.