New Orleans, Louisiana
June 26–29, 2016
ISBN: 978-0-692-68565-5
ISSN: 2153-5965
Session: New Engineering Educators
18
DOI: 10.18260/p.26444
https://peer.asee.org/26444
1758
Dr. Hylton is an Assistant Professor of Mechanical Engineering at Ohio Northern University. He previously completed his graduate studies in Mechanical Engineering at Purdue University, where he conducted research in both the School of Mechanical Engineering and the School of Engineering Education. Prior to Purdue, he completed his undergraduate work at the University of Tulsa, also in Mechanical Engineering. He currently teaches first-year engineering courses as well as various courses in Mechanical Engineering, primarily in the mechanics area. His pedagogical research areas include standards-based assessment and curriculum design, the latter currently focused on incorporating entrepreneurial thinking into the engineering curriculum.
Heidi A. Diefes-Dux is a Professor in the School of Engineering Education at Purdue University. She received her B.S. and M.S. in Food Science from Cornell University and her Ph.D. in Food Process Engineering from the Department of Agricultural and Biological Engineering at Purdue University. She is a member of Purdue’s Teaching Academy. Since 1999, she has been a faculty member within the First-Year Engineering Program, teaching and guiding the design of one of the required first-year engineering courses that engages students in open-ended problem solving and design. Her research focuses on the development, implementation, and assessment of modeling and design activities with authentic engineering contexts. She is currently a member of the educational team for the Network for Computational Nanotechnology (NCN).
Keywords: Assessment, Standards-Based, Exams
Grading and assessment in higher education have been an ongoing point of professional and scholarly discussion. As the latest pedagogical trends have shifted toward a more holistic, experiential approach to education through methods such as project-based and active learning, the education community has sought alternative ways to assess student learning in these systems. Toward this goal, there has been an increased focus on the viability and efficacy of standards-based assessment.
Standards-based assessment is an alternative to the traditional score-based grading approach, in which students are assessed directly against identified course learning objectives. Students are assessed repeatedly on their achievement of these objectives while also being provided with clear, meaningful feedback on their progress. Additionally, standards-based assessment has been proposed as having a positive impact on the perceived fairness and transparency of the assessment experience, as well as offering benefits for program assessment. While variations of a standards-based approach have previously been applied to project-based courses and even to more traditional homework-type assessments, reports of implementation in a traditional examination-based assessment environment are less commonly available in the literature.
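The mechanism described above can be sketched in code. The following minimal Python example is illustrative only: the objective names, the 0–4 rubric scale, and the aggregation rules are assumptions for the sketch, not the scheme used in the course described in this paper.

```python
# Hypothetical sketch of standards-based grading: each learning
# objective is scored repeatedly on a rubric (assumed here to be
# 0 = no evidence ... 4 = mastery), and the course-level summary
# reflects per-objective achievement rather than summed points.
from statistics import mean

# Repeated rubric scores per objective across multiple assessments
# (objective names and scores are invented for illustration).
scores = {
    "write-a-loop":        [2, 3, 4],   # improves across attempts
    "interpret-a-model":   [3, 3, 3],
    "document-a-solution": [1, 2, 2],
}

def objective_achievement(attempts):
    """Use the most recent attempt, so growth is rewarded rather
    than averaged away (one of several possible aggregation rules)."""
    return attempts[-1]

def course_indicator(all_scores):
    """Summarize per-objective achievement as a single value,
    e.g., for reporting or mapping to a letter grade."""
    return mean(objective_achievement(a) for a in all_scores.values())

for obj, attempts in scores.items():
    print(f"{obj}: achievement {objective_achievement(attempts)}")
print(f"overall: {course_indicator(scores):.2f}")
```

Note the contrast with score-based grading: no points are summed across assignments; instead, each objective carries its own achievement level, which is what makes the repeated, feedback-oriented assessment described above possible.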
In this paper, we examine the nature of the exams and standards-based rubrics and how they were implemented, communicate the lessons learned, and demonstrate how the other standards-based graded elements of the course interact with the exams to provide a more complete picture of student achievement. This effort was undertaken at a large, public university at which the first-year experience is conducted via a two-course sequence. Broadly speaking, the first course in this sequence introduces the concepts of engineering modeling and design. The second course then builds on this foundation while also introducing the foundations of programming and analysis. The focus of this work is on the second course in this sequence. Over the past several iterations of the course, all project and daily-work assessment activities have been converted to a standards-based grading system. As a continuation of this effort, the most recent iteration expanded the standards-based approach to the three course examinations. These exams include multiple-choice, short-answer, and coding-response problems with a primary emphasis on the programming and analysis content.
Feedback on the standards-based methodology was provided by students via end-of-course surveys and generally indicated a positive reception. Concerns centered on a perception among some students that the exam grading was too harsh. From a faculty perspective, the approach was viewed favorably, although there was some difficulty in interpreting and applying the rubrics and a concern that the constraints of the rubrics led to over-penalizing students in certain situations. Further, it was noted that deconstructing questions into multiple, more specific learning objectives, rather than assessing wholly on a single large objective, mitigated many of those concerns. The generally positive reception of the new approach indicates that the standards-based assessment strategy, if correctly applied, does not negatively impact the efficacy of the exam as an assessment tool, and it enables significant benefits in the speed and transparency of feedback as well as incorporation into a larger course- or program-level standards-based assessment scheme.
Hylton, J. B., & Diefes-Dux, H. A. (2016, June), A Standards-based Assessment Strategy for Written Exams Paper presented at 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. 10.18260/p.26444
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2016 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015