New Orleans, Louisiana
June 26, 2016
August 28, 2016
Educational Research and Methods
In comparing two streamlined methods to assess design knowledge, a method focused on critiquing an existing design process is shown to more closely match observed design behaviors than an open-ended prompt. This research paper compares two engineering design knowledge assessment methods in terms of their ability to assess the procedural nature of design knowledge. In particular, the focus is on the methods’ ability to assess whether students know to engage in problem formulation activities early in a design project. Problem formulation activities are key drivers of design, yet prior work has shown that fewer than 20% of students entering engineering programs recognize the role of, and engage in, problem formulation.

This research is driven by a view of engineering design as a set of behaviors. Rather than declarative, factual knowledge, engineering design is about knowing how and when to do certain things (i.e., procedural knowledge). As such, the measurement of design knowledge must capture the procedural knowledge underlying effective design behaviors. Procedural assessment methods aimed at characterizing behaviors, such as verbal protocol analysis and ethnography of design teams, are prohibitively time intensive for engineering educators to use routinely in their classes. The focus here is on evaluating two less time-intensive, or “streamlined,” assessments with respect to their ability to capture procedural knowledge. The “open-ended” assessment asks students to describe the process they would use to design a particular product. The “critique” assessment asks students to critique a proposed process for designing the same product. Both are compared to students’ actual behaviors when completing a weeklong design project.
The research question addressed in this study is: can streamlined declarative assessment techniques adequately capture procedural knowledge about problem formulation design activities?
Results show that students identify problem formulation activities more often with the open-ended assessment (69%, n=74) than with the critique-based assessment (22%, n=74). During the weeklong design project, only 16% of teams performed key problem identification (n=25). There is no statistically significant difference between the critique-based assessment results and the actual behaviors observed during the design project; there is, however, a significant difference between the open-ended assessment results and actual behaviors. The most likely explanation for students including problem formulation activities in the open-ended assessment while not performing such behaviors is that the open-ended assessment elicits recall of facts (i.e., declarative knowledge) more so than authentic behavior (i.e., procedural knowledge).
Bailey, R. (2016, June). Don't Tell Me What You Know, Tell Me What You Would Actually Do! Comparing Two Design Learning Assessment Approaches. Paper presented at 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. https://doi.org/10.18260/p.26865