
Evaluating Freshman Engineering Design Projects Using Adaptive Comparative Judgment


Conference

2017 ASEE Annual Conference & Exposition

Location

Columbus, Ohio

Publication Date

June 24, 2017

Start Date

June 24, 2017

End Date

June 28, 2017

Conference Session

Assessment of Student Work

Tagged Division

Educational Research and Methods

Page Count

14

DOI

10.18260/1-2--28301

Permanent URL

https://peer.asee.org/28301

Download Count

724

Paper Authors

Greg J. Strimel, Purdue Polytechnic Institute (orcid.org/0000-0002-4847-4526)

Dr. Greg J. Strimel is an assistant professor of engineering/technology teacher education in the Purdue Polytechnic Institute at Purdue University in West Lafayette, Indiana. His prior teaching experience includes serving as a high school engineering/technology teacher and a teaching assistant professor within the College of Engineering & Mineral Resources at West Virginia University.


Scott R. Bartholomew, Purdue University

My interests revolve around adaptive comparative judgment, integrated STEM learning, Technology & Engineering Design learning, and self-directed learning. I have taught at the middle-school, high school, and collegiate levels and am dedicated to strengthening Technology & Engineering Education.


Andrew Jackson, Purdue Polytechnic Institute (orcid.org/0000-0003-2882-3052)

Andrew Jackson is currently pursuing a PhD in Technology through Purdue's Polytechnic Institute, with an emphasis on Engineering and Technology Teacher Education. His research interests are engineering self-efficacy, motivation, and decision making. Andrew is the recipient of a 2015 Ross Fellowship from Purdue University and has been recognized as a 21st Century Fellow by the International Technology and Engineering Educators Association. He completed his Master of Science in Technology Leadership and Innovation at Purdue University with a thesis investigating middle school engineering self-efficacy beliefs. He previously taught middle school and undergraduate technology courses, accompanying both experiences with classroom research to improve practice.


Michael Grubbs, Baltimore County Public Schools

Prior to my current position as Supervisor of Technology, Engineering, and Manufacturing Education for Baltimore County Public Schools, I was a graduate research assistant at Virginia Tech and an educator in Clayton County Public Schools.


Daniel Gordon Mendiola Bates, North Carolina State University

PhD candidate in STEM Education - Technology, Engineering, and Design (TED) at NC State University. Research interests include engineering mindset, model-based reasoning, computational thinking in TED, and entrepreneurial influence in TED education. Four years of K-12 teaching experience.



Abstract


This evidence-based practice paper examines the use of a relatively new form of assessment, adaptive comparative judgment (ACJ), and considers its reliability, validity, and feasibility in contrast to traditional assessment techniques. Engineering programs often house multiple open-ended student design projects. The most common method of assessing design projects is to assign scores to student work using a predetermined rubric (Pollitt, 2004). These scores can be holistic in nature or based on micro-judgments that are summed to produce a macro-judgment of student performance (Pollitt, 2004; Kimbell, 2012). However, a problematic issue with traditional rubric-based scoring of student design work is its low reliability when multiple graders assess the work (Pollitt, 2004, 2012). As a solution to this issue, Pollitt (2004) presents an alternative form of assessment known as adaptive comparative judgment. ACJ relies on comparisons of student work rather than rubrics. Bartholomew et al. (2016) describe the method as showing judges two pieces of work (e.g., essays, pictures, technical drawings, engineering notebooks, or design portfolios) from different students or student groups and directing them to decide which piece is better. The judges are not asked to assign a grade to each piece of work; rather, they provide a holistic decision as to which artifact is better based on their own professional opinion. In each round of judgment, every artifact is compared to another, and rounds continue until a sufficient reliability level is reached and a final rank order of student work is obtained. While some may argue against the idea of comparing students to one another, Kimbell (2012) and Pollitt (2004) explain that any kind of assessment is essentially a comparison of one thing to another. As Pollitt states, “All judgments are relative. When we try to judge a performance against grade descriptors we are imagining or remembering other performances and comparing new performances to them” (2004, p. 6). ACJ has been shown to be more reliable and valid than traditional methods of assessment (Bartholomew et al., 2016; Kimbell, 2012; Pollitt, 2004, 2006, 2012).

The theoretical development of the ACJ assessment method has led to the creation of a grading engine by TAG Assessment titled CompareAssess. This product provides a platform on which student work can be rated by multiple judges, and it algorithmically outputs a rank order and standardized scores of relative work quality. This paper examines the use of CompareAssess as a means for evaluating the engineering design projects of undergraduate students, using multiple judges to compare the design artifacts of 16 undergraduate engineering students. The authors analyze the reliability and validity of this method against the performance data of each student’s solution and the traditional rubric used to evaluate the project.
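The pairwise mechanics described above can be sketched with a Bradley-Terry model, one standard way of turning accumulated "which is better?" judgments into a rank order. The CompareAssess engine itself is proprietary, so the function, fitting loop, and toy data below are illustrative assumptions, not the tool's actual algorithm:

```python
def bradley_terry(items, wins, iterations=200):
    """Estimate a quality score per item from pairwise win counts.

    items: list of artifact ids
    wins:  dict mapping (i, j) -> number of times judges preferred i over j
    Returns a dict item -> strength (higher = judged better overall).
    """
    p = {i: 1.0 for i in items}
    for _ in range(iterations):  # standard fixed-point (MM) iteration
        new = {}
        for i in items:
            num = sum(wins.get((i, j), 0) for j in items if j != i)
            den = sum(
                (wins.get((i, j), 0) + wins.get((j, i), 0)) / (p[i] + p[j])
                for j in items if j != i
            )
            new[i] = num / den if den else p[i]
        total = sum(new.values())          # normalize so scores sum to 1
        p = {i: v / total for i, v in new.items()}
    return p

# Toy data: four portfolios, five judgments per pair; judges mostly
# preferred A over B, B over C, and C over D.
items = ["A", "B", "C", "D"]
wins = {("A", "B"): 4, ("B", "A"): 1,
        ("B", "C"): 4, ("C", "B"): 1,
        ("C", "D"): 4, ("D", "C"): 1,
        ("A", "C"): 5, ("A", "D"): 5, ("B", "D"): 5}

scores = bradley_terry(items, wins)
rank = sorted(items, key=scores.get, reverse=True)
print(rank)  # ['A', 'B', 'C', 'D']
```

In a real ACJ session the "adaptive" part lies in how the next pair is chosen (pairing artifacts of similar estimated quality to maximize information per judgment) and in stopping once a reliability threshold is met; the fitting step above only shows how a rank order emerges from the comparison data.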

Strimel, G. J., Bartholomew, S. R., Jackson, A., Grubbs, M., & Bates, D. G. M. (2017, June). Evaluating Freshman Engineering Design Projects Using Adaptive Comparative Judgment. Paper presented at the 2017 ASEE Annual Conference & Exposition, Columbus, Ohio. https://doi.org/10.18260/1-2--28301

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2017 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015