New Orleans, Louisiana
June 26, 2016
August 28, 2016
Educational Research and Methods
This Work in Progress abstract describes initial efforts at a small, private Midwestern university to develop rubrics to assess student work on a term project in a second-semester introductory programming course. The course has used the project theme of developing K-12 educational software for many years; however, this was done without a client to satisfy. Consequently, the instructor could provide feedback only on the technical aspects of the implementation, and most of this feedback was summative. With the recent establishment of both an engineering education program and STEM outreach activities, the opportunity arose to provide the programming students with a meaningful client-driven design experience. The clients are education majors who are designing lesson plans for STEM outreach targeting fourth through sixth graders. The programming students are formed into teams, each working with a particular client to develop an application in support of that client's lesson plan. With a client actively working with each team, the logistics are such that formative assessment techniques can be employed.
For the first year of this updated version of the term project, traditional analytic rubrics, in which each dimension contains a descriptor for each level of the rating scale, were used. Post-activity assessment indicated a communications-related disconnect between the clients and the teams; this prompted a critical evaluation of the project's structure to determine ways to improve formative feedback. Upon reflection, it was determined that developing and using single-point rubrics would provide a formative assessment practice that would better serve each student's learning experience. The single-point rubric provides only one descriptor for each dimension: the criterion for proficient performance. The remaining performance levels for the dimension are left blank on the form. This makes rubric construction easier, as only the performance expectations need to be stated; the laundry list of possible failures is no longer included. Moreover, student interpretation of the rubrics is streamlined, since the rubric now states precisely what is considered proficient. Finally, the format makes formative assessment easier: the reviewer is encouraged to provide constructive feedback in areas needing work, positive feedback where the standards have been exceeded, or both, by writing in the blank spaces provided under the Mastery, Developing, or Lacking rating-scale columns for each dimension. The rubrics were developed by surveying the available literature and tools for examples of assessment in application design, teaming, and entrepreneurship. These, along with existing assessment tools, were then synthesized by the investigators into appropriate criteria for the single-point rubrics.
Initial feedback on their use has been positive: the client reviewers were observed to be more engaged in the process, and the students received both clear standards for their initial efforts and formative feedback to guide their subsequent efforts.
Current research efforts involve the refinement and validation of this first set of single-point rubrics. Additionally, it was observed that external reviewers were unfamiliar with the single-point concept; accordingly, training materials need to be developed that will both explain the application of single-point rubrics and assist in norming their use for greater inter-rater reliability. The long-term goals of this research are to make the developed single-point rubrics and associated materials available to the wider engineering education community and to provide a process that others can use to develop their own single-point rubrics.
Estell, J. K., Sapp, H. M., & Reeping, D. (2016, June). Work in Progress: Developing Single-Point Rubrics for Formative Assessment. Paper presented at the 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. 10.18260/p.27221