The Nature of Peer Feedback from First-year Engineering Students on Open-ended Mathematical Modeling Problems

Conference

2012 ASEE Annual Conference & Exposition

Location

San Antonio, Texas

Publication Date

June 10, 2012

Start Date

June 10, 2012

End Date

June 13, 2012

ISSN

2153-5965

Conference Session

FPD I: Research on First-year Programs Part I

Tagged Division

First-Year Programs

Page Count

23

Page Numbers

25.1323.1 - 25.1323.23

DOI

10.18260/1-2--22080

Permanent URL

https://peer.asee.org/22080

Download Count

359

Paper Authors

Kelsey Joy Rodgers, Purdue University (ORCID: orcid.org/0000-0003-2352-3464)

Kelsey Rodgers is a graduate student at Purdue University in the School of Engineering Education. She is currently conducting research on peer feedback within model-eliciting activities (MEAs) in the First-year Engineering program with her advisor, Professor Heidi Diefes-Dux. Prior to attending Purdue, she graduated from Arizona State University with her B.S.E. in engineering from the College of Technology and Innovation. She began her research in engineering education on disassemble, analyze, assemble (DAA) activities with her previous advisor at Arizona State University, Professor Odesma Dalrymple.


Heidi A. Diefes-Dux, Purdue University, West Lafayette (ORCID: orcid.org/0000-0003-3635-1825)

Heidi A. Diefes-Dux is an Associate Professor in the School of Engineering Education at Purdue University. She received her B.S. and M.S. in food science from Cornell University and her Ph.D. in food process engineering from the Department of Agricultural and Biological Engineering at Purdue University. She is a member of Purdue’s Teaching Academy. Since 1999, she has been a faculty member within the First-year Engineering program at Purdue, the gateway for all first-year students entering the College of Engineering. She has coordinated and taught in a required first-year engineering course that engages students in open-ended problem solving and design. Her research focuses on the development, implementation, and assessment of model-eliciting activities with realistic engineering contexts. She is currently the Director of Teacher Professional Development for the Institute for P-12 Engineering Research and Learning (INSPIRE).

Monica E. Cardella, Purdue University, West Lafayette (ORCID: orcid.org/0000-0002-4229-6183)

Abstract

The ability to give adequate, high-quality peer feedback on open-ended problems is a learned skill that students use throughout their education, that engineers use in industry, and that engineering researchers use in their research communities. Not only is it important for engineering students to be able to give effective peer feedback in order to function in their present and future communities; it is also vital to their education that they develop the communication skills, problem-solving skills, and professional responsibility needed to conduct such feedback. These three skills are aspects of the ABET (Accreditation Board for Engineering and Technology) student outcome requirements for engineering programs. Peer feedback in the implementation sequence of Model-Eliciting Activities (MEAs) in a large required first-year engineering course is not only beneficial to students; it is a vital step in the sequence of improving students' answers to open-ended problems, since available resources allow for only one draft review from teaching assistants (TAs). MEAs are team-oriented, open-ended mathematical problems based on the models and modeling perspective. MEAs require iteration with feedback for student teams to successfully address the complexity of the problem. Double-blind peer reviews were integrated into the MEA implementation sequence to give students an opportunity to begin building their peer feedback skills and to provide them with multiple perspectives on how to improve their solutions to the given MEA problem.

Although peer feedback is a vital skill in engineering communities and an important aspect of student education, student interviews revealed that many students have little respect for peer feedback, and that the changes they make to their MEA solutions based on peer feedback are minimal. Peer feedback needs to have greater impact, resulting in more substantive changes in final team solutions. To determine how to improve the impact of peer feedback, two primary questions regarding the quality and nature of peer feedback must first be addressed: (1) How do peer reviewers' scores compare to expert scores on student team MEA solutions? and (2) What aspects of evaluation do students focus on in peer feedback?

MEAs are completed in teams of 3 to 4 students by approximately 1,500 first-year engineering students. Teams received TA feedback on their first MEA draft solutions and peer feedback on their second MEA draft solutions. The peer feedback data collected during one MEA implemented in a fall semester was analyzed using a mixed-methods approach. All feedback is given using the MEA grading rubric, which evaluates student team solutions along three dimensions: Mathematical Model, Re-usability & Modifiability, and Audience (Share-ability). The evaluation includes a quantitative assessment (on a 1 to 4 numeric scale) of each dimension and written feedback on each dimension. Of the ~1,500 students enrolled in the course (~400 teams), 60 randomly selected teams' (237 students) draft solutions were graded by an expert. These results were quantitatively compared to the students' scoring in the peer feedback to determine the difference between peer and expert scoring. Of these 60 teams, the peer feedback received by 11 teams (42 students) was qualitatively analyzed to understand the nature of the peer feedback; open coding was used to elicit themes in the feedback for each of the three MEA rubric dimensions.

The quantitative analysis showed only slight variation between peer and expert numeric scores on the Re-usability & Modifiability and Share-ability dimensions. However, peers rated the Mathematical Model an average of 1.5 points higher than the expert, indicating that peers do not fully grasp how to critique the mathematics in an open-ended problem solution. The qualitative analysis further confirms this: the majority of the written peer feedback addresses only the Re-usability & Modifiability and Share-ability rubric dimensions.

While peer feedback appears sufficient on aspects that relate to communication of the model, students need to improve their ability to critically evaluate open-ended mathematical solutions. The lack of peer review on the mathematical aspects of draft solutions could be due to lack of familiarity with less structured solutions, lack of ability to give advice on how to improve a solution, and/or lack of ability to interpret others' work and determine a need for change. Given that engineering students are capable of critiquing their peers' communication skills, it is important to turn attention to the development of instruction that better prepares students to evaluate mathematical models. Research on the impact of instructional changes will be needed to identify effective methods for engaging students more fully in peer review.

Rodgers, K. J., Diefes-Dux, H. A., & Cardella, M. E. (2012, June). The Nature of Peer Feedback from First-year Engineering Students on Open-ended Mathematical Modeling Problems. Paper presented at the 2012 ASEE Annual Conference & Exposition, San Antonio, Texas. 10.18260/1-2--22080

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2012 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015