
Towards the Development of an Objective Assessment Technique for use in Engineering Design Education



2012 ASEE Annual Conference & Exposition


San Antonio, Texas

Publication Date

June 10, 2012

Start Date

June 10, 2012

End Date

June 13, 2012



Conference Session

The Best of Design in Engineering

Tagged Division

Design in Engineering Education

Page Numbers

25.1366.1 - 25.1366.13




Paper Authors


Scarlett R. Miller Pennsylvania State University, University Park


Scarlett Miller is an Assistant Professor of engineering design and industrial engineering at the Pennsylvania State University where she holds the James F. Will Career Development Professorship. She received her Ph.D. in industrial engineering from the University of Illinois and her M.S. and B.S. in industrial engineering from the University of Nebraska.



Brian P. Bailey University of Illinois, Urbana-Champaign


Brian Bailey is an Associate Professor in the Department of Computer Science at the University of Illinois, Urbana-Champaign. His research interests include creativity support tools, attention management systems, and, more generally, improving interactions between people and technology. Bailey received the NSF CAREER award in 2007 for his research in the area of human-computer interaction.



Alex Kirlik University of Illinois, Urbana-Champaign



Assessment of student achievement in design is an important part of engineering education, especially with the ever-increasing role of design in the engineering classroom. However, engineering programs face special challenges in assessing students' design capabilities and providing meaningful feedback because engineering design is largely subjective: there are no mathematical proofs or conclusive experiments to grade. In light of this difficulty, educators have developed a variety of ways to assess student performance during design courses, typically involving some combination of written reports, presentations, quizzes, prototypes, peer evaluations, or evaluator judgments of student project deliverables (written and oral reports and design models). However, research has shown that these methods are often insufficient for evaluating student design performance: quantitative tests of students' design skills (tests, quizzes, etc.) are not good indicators of students' design performance, and open-ended project evaluations are muddied by the subjective biases of the evaluator.

The purpose of this paper is to evaluate a new objective assessment technique's ability to determine student competence in design and to compare this measure to traditional course measures such as exam averages and final course grades. Our method utilizes the Bayesian Truth Serum (BTS), an algorithm developed for financial engineering that has been validated both theoretically and empirically and has proven to be a solid way to identify experts when subjective judgments remain the only source of evidence available and there is a possibility that most people may be wrong. Because design is inherently subjective, this method provides a way to evaluate respondents accurately.

This approach requires participants to complete a survey of subjective questions and provide not only a personal response to each question but also a prediction of the empirical distribution of the other respondents' answers. The scoring system for BTS relies on two calculations: the respondent's ability to correctly predict others' responses, and the identification of "surprisingly common" responses.

An empirical study was completed at a large public institution with 47 junior and senior engineering students in an upper-level engineering design class. The study was completed in two phases. In Phase 1, a BTS score was tabulated for each participant based on a series of responses to survey questions that asked participants to rate the design quality of pictorial design examples. Participants were also asked to predict the empirical distribution of the other participants' responses to each of these questions. In Phase 2, participants were asked to develop design ideas for an engineering design problem and select their best idea. This idea was then judged for its ability to solve the design problem by 10 other participants.

The results from this experiment show some interesting findings with implications for engineering design skill assessment. First, BTS appears to be a better indicator of student design ability than the traditional measures of course grade and test average. However, the algorithm needed some alterations, which were tested, to improve its accuracy. The results from this study are promising and provide a first step toward deriving quantitative measures of student design performance in engineering.
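The abstract describes the BTS score only at a high level: one component rewards accurate prediction of others' responses, and one rewards answers that are "surprisingly common" (more frequent than collectively predicted). As a rough sketch of how these two components can be computed, the following follows Prelec's original BTS formulation; the paper's adapted version is not given here, so the function name, the `alpha` weight, and the exact scoring details are assumptions for illustration only.

```python
import math

def bts_scores(answers, predictions, num_options, alpha=1.0):
    """Sketch of Bayesian Truth Serum scoring (Prelec's formulation).

    answers:     list of chosen option indices, one per respondent
    predictions: list of predicted frequency distributions, one per
                 respondent (each a list of num_options probabilities)
    alpha:       weight on the prediction score (hypothetical default)
    Returns one BTS score per respondent.
    """
    n = len(answers)
    eps = 1e-9  # guard against log(0)

    # Empirical frequency of each option, x_bar[k]
    x_bar = [max(answers.count(k) / n, eps) for k in range(num_options)]

    # Geometric mean of predicted frequencies, y_bar[k]
    log_y_bar = [sum(math.log(max(p[k], eps)) for p in predictions) / n
                 for k in range(num_options)]
    y_bar = [math.exp(v) for v in log_y_bar]

    scores = []
    for r in range(n):
        k = answers[r]
        # Information score: positive when the chosen answer is
        # "surprisingly common", i.e., more frequent than predicted
        info = math.log(x_bar[k] / y_bar[k])
        # Prediction score: penalizes poor forecasts of the actual
        # distribution of others' answers (a KL-divergence-style term)
        pred = sum(x_bar[j] * math.log(max(predictions[r][j], eps) / x_bar[j])
                   for j in range(num_options))
        scores.append(info + alpha * pred)
    return scores
```

In this sketch, a respondent who both picks a surprisingly common answer and forecasts the group's distribution accurately receives the highest score, which matches the intuition in the abstract that BTS can surface competent judges even when every response is subjective.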

Miller, S. R., & Bailey, B. P., & Kirlik, A. (2012, June), Towards the Development of an Objective Assessment Technique for use in Engineering Design Education Paper presented at 2012 ASEE Annual Conference & Exposition, San Antonio, Texas. 10.18260/1-2--22123

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2012 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015