June 15, 2019
June 19, 2019
Educational Research and Methods
Purpose. In this evidence-based teaching practice paper, we report on our efforts to design a means of assessing problem-framing ability. More faculty than ever are incorporating design into core engineering courses; in doing so, they may be concerned that adding design means sacrificing content. In the absence of a measure of problem-framing ability, instructors tend to rely on existing assessments that typically focus on conceptual knowledge. We report on the development, implementation, and validation of the Design Skills Test (DST), an assessment of design problem-framing ability.
Methodology. The DST includes an authentic design scenario and a coding scheme to characterize 1) factual and conceptual information used to frame the problem in terms of needs/constraints; 2) design practices used (e.g., generating ideas, considering multiple stakeholders, remaining tentative); and 3) stylistic choices (e.g., organizing their response, depicting context in representations). We developed three DST scenarios and tested them in a chemical engineering program that began threading design challenges throughout all core courses. We collected data over a three-year period (n=580). In the first and senior years, students completed the same DST twice as a pre/post measure. In the sophomore and junior years, they additionally completed a mid-year version. Students were given 15 minutes to work on the problem during class; instructors explained that there was no single right answer and that it would take a team many months to develop a solution, but that we were interested in how they start such a problem. To make data analysis feasible, two undergraduate peer-learning facilitators analyzed each DST independently (14 PLFs contributed), following minimal training.
Results. Using a validity-as-argument approach (Linn, 1994), we argue that the DST provides valid information about design problem-framing ability, provided the information is used for course improvement purposes. Inter-rater reliability was 65% to 83% for factual/conceptual codes, 52% to 77% for practice codes, and 68% to 80% for stylistic codes.
Conclusions. Our findings indicate that the DST sheds light on students’ design problem-framing ability and provides valid evidence to help faculty evaluate the impact of incorporating design challenges, as not all design challenges support students in learning how to design. Given that professional engineering design practice relies on knowing how to frame problems, it is important for students to have opportunities to develop problem-framing ability.
Implications. While reliability with minimal training was lower than would be acceptable for research purposes, we argue that it is sufficient for instructional purposes, where this approach represents a significant reduction of faculty time. To enhance reliability, we worked with instructional designers to develop an online, self-paced training. Future studies will explore the impact of this training on reliability and the extent to which DSTs are predictive of later design behaviors.
White, L. D., Svihla, V., Chen, Y., Hynson, T., Drackert, I. A., James, J. O., Saul, C. Y., & Megli, A. C. (2019, June). Validating a measure of problem-framing ability to support evidence-based teaching practice. Paper presented at the 2019 ASEE Annual Conference & Exposition, Tampa, Florida. https://peer.asee.org/33528
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2019 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.