Tampa, Florida
June 15, 2019
June 19, 2019
Experimentation and Laboratory-Oriented Studies Division Technical Session 5
Experimentation and Laboratory-Oriented Studies
10.18260/1-2--32882
https://peer.asee.org/32882
Dr. Smith is an Associate Professor at the University of Virginia.
Laboratory courses, and in particular laboratory reports, are logical choices for assessing two particular student outcomes: ‘the ability to design and conduct experiments, as well as to analyze and interpret data’ and ‘the ability to communicate effectively’ [1]. If students can articulate a clear objective, demonstrate a sound experimental procedure, and offer analysis supporting reasoned conclusions, they will have demonstrated proficiency in both outcomes. However, while assessing these outcomes may be straightforward, actually teaching these skills can be a time-intensive challenge, particularly when dealing with large sections. Simply marking up a single draft of one report with meaningful feedback can easily take 30 minutes (50 hours per 100 papers), if not more. Allowing group reports can reduce the workload, but may not ensure that every student gains the same level of practice. Delegating this job to teaching assistants is another option but can lead to issues with consistency. A final option is to leverage peer feedback, which has some obvious benefits and significant challenges.
This paper discusses the implementation of a guided peer review process in a junior-level experimental methods course with over 120 students. In the course format, students meet all together for two weekly lectures and then divide into seven sections for a weekly 2-hour lab supervised by teaching assistants. The guided process was designed to improve writing instruction, feedback, and evaluation using a beam deflection experiment. Groups of three were responsible for designing and implementing an experimental procedure within the constraints of broad guidance and available resources; however, each student submitted a separate report. Four lecture periods were devoted to helping students develop their reports, including instruction on a report template and three focused writing workshops. This was followed by a draft submittal and a two-stage blind peer review process. For the initial peer review, reviewers were guided by tasks that required them to locate and restate key ideas from the paper before identifying specific weaknesses. For example, reviewers were required to underline the technical objective, circle the control variable(s), and box the response(s). For the second draft, the report author assessed which of their main points were not successfully communicated, made corrections, and provided a rebuttal statement. The reviewers then assessed the resubmission, rated the reports, and provided minor suggestions for improvement.
The paper will include details on the experiment and the guided peer review process, as well as the logistical solutions used to achieve the blind peer review. Results from a metacognition survey given to students will also be shared.
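The paper itself describes the actual logistics; purely as an illustration, the sketch below shows one way the blind-review bookkeeping could be scripted. It is not taken from the paper: the function name assign_blind_reviews, the toy roster, the two-reviews-per-report setting, and the rule that reviewers come from outside the author's lab group are all assumptions made for this example.

# Hypothetical sketch (not from the paper) of one way to automate blind-review
# logistics: replace author names with anonymous report codes and route each
# report to reviewers outside the author's lab group.
import random

def assign_blind_reviews(authors, groups, reviews_per_report=2, seed=0):
    """authors: list of student names; groups: dict mapping name -> lab group id.
    Returns (anon_ids, assignments), where assignments maps each anonymous
    report code to a list of reviewer names drawn from other lab groups."""
    rng = random.Random(seed)
    # Anonymous report codes replace author names on the circulated drafts.
    anon_ids = {name: f"R{idx:03d}" for idx, name in enumerate(sorted(authors), start=1)}
    load = {name: 0 for name in authors}          # reviews assigned so far
    assignments = {}
    for author in sorted(authors):
        # Eligible reviewers: not the author and not in the author's lab group.
        pool = [s for s in authors if s != author and groups[s] != groups[author]]
        rng.shuffle(pool)
        pool.sort(key=lambda s: load[s])          # balance the review workload
        chosen = pool[:reviews_per_report]
        for reviewer in chosen:
            load[reviewer] += 1
        assignments[anon_ids[author]] = chosen
    return anon_ids, assignments

# Example usage with a toy roster of two three-person lab groups.
roster = ["Ana", "Ben", "Cam", "Dee", "Eli", "Fay"]
groups = {"Ana": 1, "Ben": 1, "Cam": 1, "Dee": 2, "Eli": 2, "Fay": 2}
ids, plan = assign_blind_reviews(roster, groups)
for report, reviewers in plan.items():
    print(report, "->", reviewers)

In this sketch, the anonymous codes keep the exchange blind in both directions, while the load counter keeps the number of reviews assigned per student roughly even.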
[1] ABET, Criteria for Accrediting Engineering Programs, 2018-2019, General Criterion 3: Student Outcomes (b) and (g).
Smith, N. (2019, June), "Guided Peer Review of Technical Writing for Large Laboratory Course," Paper presented at 2019 ASEE Annual Conference & Exposition, Tampa, Florida. 10.18260/1-2--32882