Eliciting Students' Interpretations of Engineering Representations

Conference

2012 ASEE Annual Conference & Exposition

Location

San Antonio, Texas

Publication Date

June 10, 2012

Start Date

June 10, 2012

End Date

June 13, 2012

ISSN

2153-5965

Conference Session

Assessment and Impact

Tagged Division

Multidisciplinary Engineering

Page Count

8

Page Numbers

25.513.1 - 25.513.8

DOI

10.18260/1-2--21271

Permanent URL

https://peer.asee.org/21271

Download Count

432

Paper Authors

Adam R. Carberry, Arizona State University (orcid.org/0000-0003-0041-7060)

Adam R. Carberry is an Assistant Professor in the College of Technology and Innovation, Department of Engineering at Arizona State University. He earned a B.S. in materials science engineering from Alfred University, and received his M.S. and Ph.D., both from Tufts University, in chemistry and engineering education respectively. His research interests include student conceptions, engineering epistemological beliefs, self-efficacy, and service-learning.

Ann F. McKenna, Arizona State University, Polytechnic

Ann F. McKenna is Chair of the Department of Engineering and the Department of Engineering Technology in the College of Technology and Innovation at Arizona State University (ASU). Prior to joining ASU as an Associate Professor of engineering, she served as a Program Officer at the National Science Foundation in the Division of Undergraduate Education, and was on the faculty in the Department of Mechanical Engineering and Segal Design Institute at Northwestern University. McKenna’s research focuses on understanding the cognitive and social processes of design, design teaching and learning, the role of adaptive expertise in design and innovation, the impact and diffusion of education innovations, and teaching approaches of engineering faculty. McKenna received her B.S. and M.S. degrees in mechanical engineering from Drexel University and Ph.D. from the University of California, Berkeley. McKenna also serves as an Associate Editor for the Journal of Engineering Education.

Odesma Onika Dalrymple, Arizona State University, Polytechnic

Abstract

Assessing what students have truly learned from an intervention can be quite challenging. Instructors use many different methods to assess learning, including homework, exams, demonstrations, discussions, observations, and interviews. No matter what approach is implemented, the instructions and the wording associated with these assessment methods are critical to the accuracy of the assessment. Student interpretations of what the instructor is asking for can drastically alter their response to the assessment, to the point where a poorly posed assessment may not assess the students’ understanding of the intended context at all. Critically related to assessment design is how students respond to the assessment: do they use effective representations and/or models to answer the question, or do they default to written responses?

The following study investigates assessment design and student representations by analyzing student responses to similarly purposed questions. Sixty-four students enrolled in a multidisciplinary engineering program were asked, as part of their required sophomore-level project-based course, to represent their understanding of how the DC voltage measurement function (Figure 1) works in a multimeter that they assembled. Students were first asked to submit for homework their response to the following question:

Describe how the DC voltage measurement function works.

In the follow-up class, students were then asked to respond to the following question:

What evidence would you present to convince someone that the DC voltage measurement works?

Both questions were designed to assess how well students understood the DC voltage measurement function of their multimeter, with a focus on understanding how students represent that understanding, i.e., through descriptions, drawings, or mathematical equations. This study explores and compares the differences between student responses when they are asked to “describe” how something works and when they are asked to provide “evidence to convince someone” that something works.

Responses were collected from the two assignments and analyzed to identify whether the changes in wording caused students to interpret and respond to the questions differently. A preliminary comparative analysis of the overall class responses indicates that most students provide equivalent responses to the two questions. However, some students identify a difference and present different ideas and alternative representations in their answers.

The results of this study are intended to assist faculty in appropriately assessing student learning and to provide a vehicle for introducing effective use of representations, specifically models, to describe how something works.

Figure 1: Simplified DC voltage measurement diagram for the Model M-10005 Digital Multimeter Kit, Elenco Electronics, Inc.

Carberry, A. R., McKenna, A. F., & Dalrymple, O. O. (2012, June). Eliciting Students' Interpretations of Engineering Representations. Paper presented at the 2012 ASEE Annual Conference & Exposition, San Antonio, Texas. 10.18260/1-2--21271

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2012 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015