June 14, 2015
June 17, 2015
Design in Engineering Education
26.951.1 - 26.951.17
Additive Rubrics for Capstone Design Courses

ABSTRACT

While assessment rubrics have been used to help with tutor and self-assessment in engineering design, it is not easy to find examples to guide the preparation of specific rubrics for a design course. The difficulty of describing design knowledge has been one factor that might explain this (Bailey & Szabo, 2006; McKenzie, Trevisan, Davis, & Beyerlein, 2004). Recent education research has demonstrated some of the educational benefits of using rubrics to guide self-assessment, though much depends on the quality of rubric design (Jonsson & Svingby, 2007). While both self- and peer-assessment can provide significant assessment time savings for tutors, self-assessment has distinct student learning advantages according to recent research.

The development of new capstone design courses prompted a search for effective assessment methods that would allow effective use of staff time in a resource-constrained environment. While self- and peer-assessment seemed to provide promising avenues to reduce assessment time demands on staff, the research literature has not yet demonstrated how reliable they can be in the context of capstone design courses. Given the gaps in the literature explained above, it was necessary to design our own assessment rubrics. Given that self-assessment provides added learning benefits in comparison with peer-assessment, we decided to adopt this approach (Jonsson & Svingby, 2007). This paper explains the assessment instruments and presents examples to enable others to build on and improve them.

After four years of development and evaluation, this author has found that effective self-assessment requires a rubric that enables students with only basic content understanding to appreciate what is required for each level of attainment. Most of the rubrics available in the literature are ‘subtractive’ in the sense that each level of attainment describes what is missing compared with an ideal submission (e.g. Moskal, 2000; Spurlin, Rajala, & Lavelle, 2008; Trevisan, Davis, Calkins, & Gentili, 1999). The difficulty with this approach is that the student with only basic content understanding does not yet appreciate what is involved in preparing an ideal submission. An ‘additive’ approach, in which each level of attainment is described as an increment on the previous level, seems to be much easier for students to understand and follow.

The use of self-assessment not only promotes student learning, as reported in the research literature, but also enables students to learn how to judge the quality of design work. Students were required to bring completed self-assessment rubrics to weekly design tutorials to grade homework consisting of drawing and writing exercises completed in a journal. The homework included reflective writing tasks. The rubrics also contained assessment guides for in-class exercises, which the students also completed before the end of the class. Tutors inspected students’ journals and in-class submissions to check self-assessments and modify them when necessary. While doing this, the tutors provided face-to-face individual verbal feedback to each student.

Out-of-class marking and assessment was almost completely eliminated from the course: the rubrics enabled all of this to be completed during class times. The results from completed paper rubrics could be transferred to the learning management system grades database by administrative staff.

Grading of a major semester project requiring an individual report from each student was completed in about two thirds of the time required before the self-assessment rubrics were introduced. The paper includes details on the resource requirements to run the design course for a large class.

Additive assessment rubrics have relieved tutors from most preparation and out-of-class marking duties, enabling them to spend almost all their time on face-to-face discussions with students.
Student engagement, as assessed by the extent to which several hours of weekly homework was completed, was significantly improved. The paper includes comprehensive guidance on the preparation of additive rubrics and several examples for different kinds of homework and in-class exercises.

KEYWORDS

Assessment, capstone design, rubric

REFERENCES

Bailey, R., & Szabo, Z. (2006). Assessing Engineering Design Process Knowledge. International Journal of Engineering Education, 22(3), 508-518.

Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2, 130-144. doi:10.1016/j.edurev.2007.05.002

McKenzie, L. J., Trevisan, M. S., Davis, D. C., & Beyerlein, S. W. (2004). Capstone Design Courses and Assessment: A National Study. Paper presented at the American Society for Engineering Education Annual Conference & Exposition.

Moskal, B. M. (2000). Scoring rubrics: what, when and how? Practical Assessment, Research & Evaluation, 7(3).

Spurlin, J. E., Rajala, S. A., & Lavelle, J. P. (2008). Designing better engineering education through assessment: A practical resource for faculty and department chairs on using assessment and ABET criteria to improve student learning. Stylus.

Trevisan, M., Davis, D. C., Calkins, D. E., & Gentili, K. L. (1999). Designing sound scoring criteria for assessing student performance. Journal of Engineering Education, 88(1), 79-85.
Trevelyan, J. (2015, June). Incremental Self-Assessment Rubrics for Capstone Design Courses. Paper presented at the 2015 ASEE Annual Conference & Exposition, Seattle, Washington. doi:10.18260/p.24288
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015