
Implementing Calibrated Peer Review To Enhance Technical Critiquing Skills In A Bioengineering Laboratory



2008 Annual Conference & Exposition


Pittsburgh, Pennsylvania

Publication Date

June 22, 2008

Start Date

June 22, 2008

End Date

June 25, 2008



Conference Session

Instructional Methods and Tools in BME

Page Numbers

13.708.1 - 13.708.13




Paper Authors


Ann Saterbak, Rice University


Ann Saterbak is Director of Laboratory Instruction and Lecturer in the Bioengineering Department at Rice University. Dr. Saterbak teaches laboratory, lecture, and problem-based learning courses. She is the lead author of the textbook Bioengineering Fundamentals, published in 2007 by Prentice Hall. She received her B.A. in Chemical Engineering and Biochemistry from Rice University in 1990 and her Ph.D. in Chemical Engineering from the University of Illinois at Urbana-Champaign in 1995.



Tracy Volz, Rice University


Tracy Volz is the Assistant Director and an award-winning instructor for the Cain Project in Engineering and Professional Communication at Rice University. She supports written, oral, and visual communication instruction in science and engineering courses. In addition to working with students, Dr. Volz has conducted communication seminars about oral presentations, interviewing, and technical poster design for the Texas Society of Professional Engineers and Baylor College of Medicine.



NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Implementing Calibrated Peer Review™ to Enhance Technical Critiquing Skills in a Bioengineering Laboratory


Developed at UCLA, Calibrated Peer Review™ (CPR) is a web-based tool designed to help students improve their technical writing and critiquing skills. In 2006 and 2007 we used CPR in an upper-level tissue culture laboratory course in which students conduct viability, attachment, and proliferation assays using fibroblast cells. After completing their experiments, students use PowerPoint to construct a technical poster that illustrates their experimental methods, results, and conclusions.

For the CPR component of the assignment, students first evaluate three sample posters supplied by the instructor to calibrate their critiquing skills. After this step, students conduct a blind review of three peers’ posters and then evaluate their own. During the calibration, peer-critiquing, and self-evaluation stages, students respond to 15 statements about the quality of the posters. Eleven statements cover technical content, including a succinct summary of objectives, clear experimental methods, quality of graphs, and key results interpreted in words. Three statements probe the poster’s visual appeal, including appropriate size and style of font. A final statement requires a holistic evaluation of the poster. Following CPR, students submit a revised copy of their technical poster.

In 2006, students had difficulty during the calibration phase. Following a major revision of the calibration phase in 2007, 79% of students passed all three calibration posters. Instructor, peer, and self evaluations were compared. There was a strong linear correlation between instructor evaluation and peer evaluation (r = 0.60, regression model ANOVA P < 0.0002). In contrast, there were poor linear correlations between instructor and self evaluations and between peer and self evaluations (r < 0.25, regression model ANOVA P > 0.2). These results suggest that students may be better able to evaluate others’ work technically than their own. Students perceived the peer evaluation process as generally helpful, although they noted that their peers’ comments were less specific and occasionally inconsistent with their instructor’s feedback. Students reported on surveys that peer evaluation was effective in helping them recognize many facets of technical poster design, such as errors and omissions, data presentation, and technical argument. Ninety-seven percent of the students reported that their technical critiquing skills improved as a result of this experience. We feel that using CPR to facilitate the peer evaluation process is an effective way to enhance undergraduate engineering students’ technical critiquing skills.
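The correlation reported above (r = 0.60 between instructor and peer scores) is a standard Pearson product-moment coefficient. As an illustration only, a minimal sketch of how such a comparison could be computed is shown below; the score lists are hypothetical placeholder values, not the study's actual data.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical holistic scores (1-10 scale) for six posters -- illustrative
# placeholder data, not taken from the paper.
instructor_scores = [7, 5, 9, 6, 8, 4]
peer_mean_scores = [6, 5, 8, 7, 8, 5]
print(round(pearson_r(instructor_scores, peer_mean_scores), 2))
```

In practice one would also test the significance of the regression (the ANOVA P-values reported above), e.g. with `scipy.stats.linregress`, rather than hand-rolling the statistic.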

Introduction to Calibrated Peer Review™ (CPR)

Developed at UCLA in 1995, CPR promotes active learning through writing and models the peer review process used in science and engineering disciplines. The National Science Foundation and the Howard Hughes Medical Institute provided initial funding for CPR, and it has been used at over 500 academic institutions.1 According to Orville Chapman, CPR’s creator, the tool enables students to “develop key skills such as abstracting, persuading

Saterbak, A., & Volz, T. (2008, June), Implementing Calibrated Peer Review To Enhance Technical Critiquing Skills In A Bioengineering Laboratory Paper presented at 2008 Annual Conference & Exposition, Pittsburgh, Pennsylvania. 10.18260/1-2--3128

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2008 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015