Incremental Self-Assessment Rubrics for Capstone Design Courses

Conference

2015 ASEE Annual Conference & Exposition

Location

Seattle, Washington

Publication Date

June 14, 2015

Start Date

June 14, 2015

End Date

June 17, 2015

ISBN

978-0-692-50180-1

ISSN

2153-5965

Conference Session

Capstone Design

Tagged Division

Design in Engineering Education

Tagged Topic

Diversity

Page Count

17

Page Numbers

26.951.1 - 26.951.17

DOI

10.18260/p.24288

Permanent URL

https://peer.asee.org/24288

Download Count

321

Paper Authors

James Trevelyan, University of Western Australia (ORCID: orcid.org/0000-0002-5014-2184)

Professor James Trevelyan works part-time as a Winthrop Professor in the Mechanical and Chemical Engineering School at The University of Western Australia. He is a Fellow of Engineers Australia and also practices as a mechanical and mechatronics engineer developing new air conditioning technology.

His main area of research is engineering practice, and he teaches design, sustainability, engineering practice, and project management.

He is well known internationally for pioneering research that resulted in sheep shearing robots (1975-1993). He and his students produced the first industrial robot that could be remotely operated via the internet in 1994. He was presented with the 1993 Engelberger Science and Technology Award in Tokyo in recognition of his work, and has twice been presented with the Japan Industrial Robot Association award for best papers at ISIR conferences. These are the leading international awards for robotics research. He has also received university, national and international awards for his teaching and papers on engineering education.

From 1996 to 2002 he researched landmine clearance methods, and his web site is an internationally respected reference point for information on landmines. He was awarded honorary membership of the Society of Counter Ordnance Technology in 2002 for his efforts, and was also elected a Fellow of the Institution of Engineers Australia.

Professor Trevelyan’s web page is http://www.mech.uwa.edu.au/jpt/ providing further information on his research and teaching.

Abstract

Additive Rubrics for Capstone Design Courses

While assessment rubrics have been used to help with tutor and self-assessment in engineering design, it is not easy to find examples to guide the preparation of specific rubrics for a design course. The difficulty of describing design knowledge has been one factor that might explain this (Bailey & Szabo, 2006; McKenzie, Trevisan, Davis, & Beyerlein, 2004). Recent education research has demonstrated some of the educational benefits of using rubrics to guide self-assessment, though much depends on the quality of rubric design (Johnsson & Svingby, 2007). While both self- and peer-assessment can provide significant assessment time savings for tutors, recent research suggests that self-assessment has distinct student learning advantages.

The development of new capstone design courses prompted a search for effective assessment methods that would allow effective use of staff time in a resource-constrained environment. While self- and peer-assessment seemed to provide promising avenues to reduce assessment time demands on staff, the research literature has not yet demonstrated how reliable they can be in the context of capstone design courses. Given the gaps in the literature explained above, it was necessary to design our own assessment rubrics. Because self-assessment provides added learning benefits in comparison with peer-assessment, we decided to adopt this approach (Johnsson & Svingby, 2007). This paper explains the assessment instruments and presents examples to enable others to build on them and improve them.

After four years of development and evaluation, this author has found that effective self-assessment requires a rubric that enables students with only basic content understanding to appreciate what is required for each level of attainment. Most of the rubrics available in the literature are 'subtractive' in the sense that each level of attainment describes what is missing compared with an ideal submission (e.g. Moskal, 2000; Spurlin, Rajala, & Lavelle, 2008; Trevisan, Davis, Calkins, & Gentili, 1999). The difficulty with this approach is that a student with only basic content understanding does not yet appreciate what is involved in preparing an ideal submission. An 'additive' approach, in which each level of attainment is described as an increment on the previous level, seems to be much easier for students to understand and follow.

The use of self-assessment not only promotes student learning, as reported in the research literature; it also enables students to learn how to judge the quality of design work. Students were required to bring completed self-assessment rubrics to weekly design tutorials to grade homework consisting of drawing and writing exercises completed in a journal. The homework included reflective writing tasks. The rubrics also contained assessment guides for in-class exercises, which the students also completed before the end of the class. Tutors inspected students' journals and in-class submissions to check self-assessments and modify them when necessary. While doing this, the tutors provided face-to-face individual verbal feedback to each student.

Out-of-class marking and assessment was almost completely eliminated from the course: the rubrics enabled all of this to be completed during class times. The results from completed paper rubrics could be transferred to the learning management system grades database by administrative staff. Grading of a major semester project requiring an individual report from each student was completed in about two thirds of the time required before the self-assessment rubrics were introduced. The paper includes details on the resource requirements to run the design course for a large class.

Additive assessment rubrics have relieved tutors from most preparation and out-of-class marking duties, enabling them to spend almost all their time on face-to-face discussions with students. Student engagement, as assessed by the extent to which several hours of weekly homework was completed, was significantly improved. The paper includes comprehensive guidance on the preparation of additive rubrics and several examples for different kinds of homework and in-class exercises.

KEYWORDS

Assessment, capstone design, rubric

REFERENCES

Bailey, R., & Szabo, Z. (2006). Assessing engineering design process knowledge. International Journal of Engineering Education, 22(3), 508-518.

Johnsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2, 130-144. doi:10.1016/j.edurev.2007.05.002

McKenzie, L. J., Trevisan, M. S., Davis, D. C., & Beyerlein, S. W. (2004). Capstone design courses and assessment: A national study. Paper presented at the American Society for Engineering Education Annual Conference & Exposition.

Moskal, B. M. (2000). Scoring rubrics: What, when and how? Practical Assessment, Research & Evaluation, 7(3).

Spurlin, J. E., Rajala, S. A., & Lavelle, J. P. (2008). Designing better engineering education through assessment: A practical resource for faculty and department chairs on using assessment and ABET criteria to improve student learning. Stylus.

Trevisan, M., Davis, D. C., Calkins, D. E., & Gentili, K. L. (1999). Designing sound scoring criteria for assessing student performance. Journal of Engineering Education, 88(1), 79-85.
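The paper itself does not specify a data format for its rubrics, but the central idea — that each level of attainment is described only as an increment on the previous level, so the full descriptor for any level is the cumulative sum of increments — can be sketched as a small data structure. The criterion name and descriptors below are invented for illustration; only the additive structure reflects the abstract.

```python
# A minimal sketch of an "additive" rubric, assuming each attainment
# level is stated as an increment over the previous level. A student
# at level N has satisfied the descriptors for levels 1..N, rather
# than being told what is missing relative to an ideal submission.
from dataclasses import dataclass, field


@dataclass
class AdditiveCriterion:
    name: str
    # increments[i] states what level i+1 adds beyond level i
    increments: list[str] = field(default_factory=list)

    def descriptor(self, level: int) -> str:
        """Cumulative description of attainment at a 1-based level."""
        if not 1 <= level <= len(self.increments):
            raise ValueError(f"level must be in 1..{len(self.increments)}")
        return "; plus ".join(self.increments[:level])


# Hypothetical criterion for a design-journal sketching exercise.
sketching = AdditiveCriterion(
    name="Design sketching",
    increments=[
        "produces a legible freehand sketch of the concept",
        "annotates key dimensions and materials",
        "shows alternative concepts with brief trade-off notes",
    ],
)

print(sketching.descriptor(2))
# A level-2 student reads level 1 plus one increment, so even a student
# with only basic content understanding can see the next step up.
```

A subtractive rubric would instead store, for each level, everything an ideal submission has that this level lacks — which presupposes the student already understands the ideal submission.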

Trevelyan, J. (2015, June), Incremental Self-Assessment Rubrics for Capstone Design Courses Paper presented at 2015 ASEE Annual Conference & Exposition, Seattle, Washington. 10.18260/p.24288

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015