Session 2630

Web Based Forms for Design Team Peer Evaluations

Elizabeth A. Eschenbach and Marc A. Mesmer
Humboldt State University

Abstract

This paper describes the use of web based forms for a peer review process used in teaching ENGR 111: Introduction to Design and is a follow-up of work reported at the 1997 ASEE meeting: Using Peer Evaluations for Design Team Effectiveness. The paper describes the functionality of the web based software and provides examples of web based peer evaluation forms, as well as a summary of the training students receive on how to write a good peer
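The excerpt above describes the software's functionality only in prose. Purely as an illustration of the kind of collect-and-store behavior a web based peer evaluation form implies (one rating and a written comment per teammate), here is a minimal sketch in Python using Flask; the route, field names, and in-memory storage are hypothetical and are not the original 1998 system.

```python
# Hypothetical sketch of a web based peer-evaluation form handler.
# Not the paper's software; it only illustrates the described functionality.
from flask import Flask, request

app = Flask(__name__)
evaluations = []  # in-memory store; a real system would persist to a database

@app.route("/peer-eval", methods=["POST"])
def submit_evaluation():
    evaluations.append({
        "rater": request.form["rater"],         # student submitting the form
        "teammate": request.form["teammate"],   # teammate being evaluated
        "rating": int(request.form["rating"]),  # e.g., a 1-5 contribution score
        "comments": request.form["comments"],   # the written peer evaluation
    })
    return "Evaluation received", 201

if __name__ == "__main__":
    app.run(debug=True)
```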
, without having to wait until all students’ work has been graded. Indeed, peer assessment is one of the few scalable approaches to assessment: as the amount of work to assess increases, the resources available for assessment increase proportionally.

Perhaps the most frequent use of peer assessment is for teaching writing. Writing for an audience of their peers forces students to explain themselves well enough that they can be understood by non-experts. It also gives them the benefit of seeing and responding to their peers’ reactions to what they write.

Writing is important in engineering, of course. It is a good way for students to grapple with ethical issues that arise in their professional development [5, 6
) as a publication and its review process, and 3) best practices in peer reviewing (i.e., organization, quality considerations, tips for writing reviews). Triads then attend a synchronous session together, and after an icebreaker activity and a brief overview of the program, they conduct a mock review of a short, published manuscript together as a triad during the session. The mock review makes use of a Structured Peer Review form, which helps triads organize their reviews (strengths, weaknesses, and recommendations) and provides the team with insights on what participants are taking into consideration as they conduct their review. (The Structured Peer Review form, which was developed by the project team, is shown in Figure 2.) The session concludes
project, each student has a 15-minute conference with the class instructor. During the conference, the student presents a team evaluation on a computer disk, discussing the strengths and weaknesses of the team and all team members (including him or herself). Then the student and the instructor discuss ways to improve team productivity. The instructor gives the student hints on how to write a more descriptive evaluation.

At the end of the semester, each team member turns in a self evaluation and peer evaluation of all team members on a disk. The evaluations from all team members are combined and then split into summary evaluations, one for each team member. A summary evaluation is returned to each team member during the final period of the class. The
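The combine-and-split step described above is straightforward to automate. As an illustration only (the original course used evaluations submitted on disk, and the field names below are hypothetical), a short Python sketch of merging every member's submitted evaluations and regrouping them into one summary per teammate:

```python
from collections import defaultdict

def build_summaries(evaluations):
    """Combine all submitted evaluations, then split them into one summary
    per evaluated team member. Field names are hypothetical examples."""
    summaries = defaultdict(list)
    for ev in evaluations:
        # each record is one rater's evaluation of one teammate
        summaries[ev["about"]].append(
            {"strengths": ev["strengths"], "weaknesses": ev["weaknesses"]}
        )
    return dict(summaries)

# Example: two raters evaluate the same teammate
evals = [
    {"rater": "Sam", "about": "Pat", "strengths": "reliable",
     "weaknesses": "quiet in meetings"},
    {"rater": "Lee", "about": "Pat", "strengths": "strong CAD work",
     "weaknesses": "late to meetings"},
]
print(build_summaries(evals)["Pat"])
```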
, the feedback comes more quickly. An author can usually see the feedback as soon as the reviewer provides it, rather than having to wait until the instructor or TA is finished grading all the students. Finally, peer assessment forces students to write in a way that their peers can understand. They can’t use shorthand that the instructor, with his/her superior knowledge, is expected to decipher. They learn to write for an audience of their peers, which is exactly the skill they need for later in their careers. Peer assessment has been shown to improve learning across the curriculum [1].

Online peer-assessment systems perform the same basic functions, though they often have features aimed at the types of courses taught by their designers, e.g., art
grading event.) Also, before the next class Armani will try the new assignment. A diligent Armani will refer to the textbook, find help as needed, and invent ways to check answers. Students like that would probably thrive under any form of instruction. However, maybe Armani will skip the assigned reading in an attempt to save time. Some answers will be found correctly, but many will not. If Armani does not have enough time or perseverance to finish well, the peer grading rubric will encourage Armani to at least think about and write something down for each problem. Also,
education. It includes both discipline-specific resources (e.g., demonstrations, tutorials, on-line experiments, course notes) and more general resources for educational research and improvement (e.g., guidelines for writing and assessing student learning outcomes). Although other databases exist, MERLOT is unusual because it includes a system for peer review. Editorial boards assign objects already in the database to reviewers with relevant technical expertise. Reviewers’ comments on technical content, ease of use, and educational potential are then displayed in the database along with the link to the learning object, as well as suggestions for how to incorporate the learning object into a course. The MERLOT engineering editorial board is actively
Session 3530

Comparing the Reliability of Two Peer Evaluation Instruments

Matthew W. Ohland, Richard A. Layton
University of Florida / North Carolina A&T State University

Abstract

This paper presents an analysis of student peer evaluations in project teams to compare the reliability of two different evaluation procedures. The project teams consist of junior-level students in a mechanical engineering design course taught by Layton for five semesters in 1997, 1998, and 1999.

The peer-evaluation instruments were used by students to evaluate their teammates’ contributions to the team’s deliverables—oral and
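The excerpt does not reproduce the statistical procedure used to compare the two instruments. Purely as an illustration of one common way to quantify the consistency of peer ratings (not necessarily the measure used in this paper), here is a short Python sketch of Cronbach's alpha for a ratings matrix with hypothetical scores.

```python
from statistics import pvariance

def cronbach_alpha(ratings):
    """Cronbach's alpha for a ratings matrix: rows are ratees, columns are
    raters. Shown only as an example reliability statistic; it is not
    necessarily the analysis performed in the paper."""
    k = len(ratings[0])  # number of raters
    item_variances = [pvariance([row[j] for row in ratings]) for j in range(k)]
    total_variance = pvariance([sum(row) for row in ratings])
    return (k / (k - 1)) * (1 - sum(item_variances) / total_variance)

# Example: four students each rated by three teammates on a 1-5 scale
scores = [[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2]]
print(round(cronbach_alpha(scores), 2))  # higher values indicate more consistency
```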
meaningful feedback to your peer related to his or her syllabus. Provide meaningful feedback to your peer related to classroom observations of his or her teaching strategies. Provide meaningful feedback to your peer related to the evidence of student learning that your peer collects from his or her students.

Task 2: Attend group meetings with your PRT leader.

Task 3: Write three reflective essays per semester, based on your goals and feedback from your peer. The essays must be completed no later than the last day of the semester. The three essays (not to exceed one page) should be based on: 1.) discussions with your peer related to your syllabus or outcomes for the class that is being reviewed, 2.) discussions with your peer related to the teaching
Same-year dyadic fixed-role peer tutoring: Tutoring between pairs at the same point in the course. One person retains the role of tutor throughout.

Same-year dyadic reciprocal peer tutoring: Tutoring between pairs at the same point in the course. The tutor role is reciprocated within the pair.

Dyadic cross-year fixed-role peer tutoring: The tutor has a higher academic status than the tutee.

Same-year group tutoring: Rotating presentations by individual students to the peer group.

Peer assisted writing
foundational research in student retention and other evidence-based practices that engage, enroll, and graduate their women and BIPOC engineers.

5. Professional Learning
   a. Provide a toolbox of resources to guide collaboration and partnerships at their respective institutions, with partners, and with each other (broader impact/broadening participation, proposal development, writing research papers, etc.).
   b. Expand PEERs’ understanding of national funding opportunities aligned with their institutional goals (NSF grants, national education grants, industry grants, etc.).
Paper ID #30221

A Vertically Integrated Design Program Using Peer Education

Dr. Ross Aaron Petrella, University of North Carolina and North Carolina State University Joint Department of Biomedical Engineering

Dr. Petrella received his B.S. in biomedical engineering from Virginia Commonwealth University in Richmond, VA and his Ph.D. in biomedical engineering from Old Dominion University in Norfolk, VA. He joined the University of North Carolina and North Carolina State University Joint Department of Biomedical Engineering first as a postdoctoral research scholar and is now an assistant teaching professor where he teaches
AC 2012-4169: INTERDISCIPLINARY STEM PEER-MENTORING AND DISTANCE-BASED TEAMS

Brian F. Martensen, Minnesota State University

Brian F. Martensen is an Associate Professor and Chair of the Department of Mathematics and Statistics at Minnesota State University, Mankato. He began working with the NSF-supported MAX Scholar Program in 2009. His interests include inquiry-based models of instruction and ways to facilitate the transition of majors to professionals. His mathematical research is in the area of dynamical systems and topology.

Dr. Deborah K. Nykanen P.E., Minnesota State University, Mankato

Deborah K. Nykanen is an Associate Professor of Civil Engineering at Minnesota State University, Mankato. She received her Ph.D
progression of a student through the program provides valuable opportunities for “stepping stone peer mentoring” and individual student development. Our selection process addresses diversity issues by factoring in major, gender, year, eligibility for subsidized financial aid (a program requirement), community college background, and first-generation status. In addition, we ask students to write a brief essay describing how they will contribute to the program’s diversity, given a broad definition that incorporates such things as race, religion, socioeconomic status, and breadth of experience in communities. We strive to select students who are motivated and who could have an improved educational experience given the opportunity to be a member of the cohort, to
Paper ID #49714

Enhancing Clinical Immersion Experience with Peer-Mentoring Support

Tiffany Marie Chan, University of California, Davis

Tiffany Chan is a 4th-year undergraduate student in biomedical engineering at UC Davis and the recipient of the 2024 ASEE-PSW Section Undergraduate Student Award. She actively contributes to the cube3 Lab, where her interests lie in community building and inclusive practices. Tiffany is involved in various DEI (Diversity, Equity, and Inclusion) research initiatives within the lab, including organizing student-faculty lunches and participating in the gender equity first-year seminar program
Paper ID #32312

Bias in First-Year Engineering Student Peer Evaluations

Lea Wittie, Bucknell University

Lea Wittie is an Associate Professor in the Department of Computer Science in the Engineering College at Bucknell University. She has spent the past 4 years coordinating the first-year Engineering students’ Introduction to Engineering course and over a decade participating in the program before that.

James Bennett, Cornell University

James Bennett is a biomedical engineer specializing in medical device design and development. He has earned a Bachelor of Science Degree in Biomedical Engineering from Bucknell University and is currently
2006-1382: PEER ASSESSMENT METHODOLOGIES FOR A LABORATORY-BASED COURSE

Rathika Rajaravivarma, Central CT State University

© American Society for Engineering Education, 2006

Peer Assessment Methodologies for a Laboratory-Based Course

Abstract

Advances in technology and the explosive growth of the Internet have called for new learning environments. Content delivery is no longer the passive approach of a lecture emanating from the teacher to the student. It is imperative that computer networking courses taught at the undergraduate level contain adequate hands-on, implementation-based projects and experiments in order to better train students. The computing curricula 2001 (CC2001
Session 2330

Peer Evaluations in Teams of Predominantly Minority Students

Richard A. Layton, Matthew W. Ohland
North Carolina A&T State University / University of Florida

Abstract

This paper presents an analysis of student peer evaluations in project teams where the majority of the students are African-American. Peer evaluations were used to assign individual grades from group grades for design projects in a junior-level mechanical engineering course taught by Layton for three semesters in 1997-99. This study is similar to and complements a 1999 study by Kaufman, Felder, and Fuller. The results of the two
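The abstract does not reproduce the grade-adjustment arithmetic. As an illustration only, a scheme commonly described in the peer-rating literature multiplies the team grade by each member's average peer rating relative to the team average; the function, cap, and numbers below are hypothetical, not the instrument used in this study.

```python
def individual_grades(team_grade, peer_ratings, cap=1.05):
    """Adjust one team grade into individual grades using average peer ratings.
    This adjustment-factor scheme is a common approach in the peer-rating
    literature, shown only as an illustration (values are hypothetical)."""
    team_average = sum(peer_ratings.values()) / len(peer_ratings)
    grades = {}
    for member, rating in peer_ratings.items():
        factor = min(rating / team_average, cap)  # optionally cap the boost
        grades[member] = round(team_grade * factor, 1)
    return grades

# Example: a team grade of 85 and each member's average rating on a 0-100 scale
print(individual_grades(85, {"Alex": 92, "Blake": 80, "Casey": 88}))
```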
Session 2230

Peer Ratings Revisited: Focus on Teamwork, Not Ability

Richard A. Layton, Matthew W. Ohland
Rose-Hulman Institute of Technology / Clemson University

Abstract

In a previous study, we determined that student peer ratings used to assign individual grades from group grades showed no effects relating to gender but significant effects relating to race. A likely explanation of this result was that students seem to base ratings on perceived ability instead of real contribution to the group effort. To overcome this tendency, we modified the peer-rating instrument, instructed students on the
research focuses on creating inclusive and equitable learning environments through the development and implementation of strategies geared towards increasing student sense of belonging.

Dr. Audrey Boklage
Madison E. Andrews

© American Society for Engineering Education, 2022

Peer Mentors Forging a Path in Changing Times

“When I first started thinking about inclusivity, I recognized that I wanted to share what I was learning. I also want to spread word about my department and even more I want to spark more interest for STEM and/or engineering, keep working on inclusive practices, and work on
Good Teaching: As Identified by Your Peers

Abstract:

The literature on teaching is replete with definitions and examples of good teaching. They include the traits and characteristics of the best instructor/teacher/professor. They have examples of methods and results of surveys that quantify teaching: bad or good. In recent years, the literature has included the impact of teaching on the student learner; thus, coming full circle, from teacher to learner. The literature provides good information, but it is the analysis of the current classroom experience of one’s peers that provides reliable information on the teaching of today’s students.

Since 1998, over 1000 faculty have pondered over 5 questions concerning good teaching. They have pair-shared the results
slightly aware that someone is going to have to mark their work and I did witness some students think about how they lay it out and are aware they will lose marks for insufficient working. So hopefully this ended in them constructing better answers in tests and exams.”

“The student learning did improve as a result of peer marking exercise as it allows them to know how others think”.

“It forces the students to grasp the material at early stage of (the) course which results in better understanding of the course.”

“I marked (a) few exams and found that most of the students did write the UNITS of the quantities in (their) solution. It was definitely due to peer-marking exercise.”

“I think peer marking exercise is a good practice to do and it adds an
Paper ID #24251

2018 CoNECD - The Collaborative Network for Engineering and Computing Diversity Conference: Crystal City, Virginia Apr 29

A Review of Bias in Peer Assessment

Jacklin Hope Stonewall, Iowa State University

Jacklin Stonewall is a Ph.D. student in the Departments of Industrial and Manufacturing Systems Engineering and Human Computer Interaction at Iowa State University. Her research interests include: gender HCI, decision support systems, sustainability, and the creation of equitable cities and classrooms.

Prof. Michael Dorneich, Iowa State University

Dr. Michael C. Dorneich is an Associate Professor at Iowa State
Session 2630

Dynamics of Peer Interactions in Cooperative Learning

Cynthia R. Haller, Victoria J. Gallagher, Tracey L. Weldon, Richard M. Felder
North Carolina State University

Abstract

Although many recent studies demonstrate that cooperative learning provides a variety of educational advantages over more traditional instructional models, little is known about the interactional dynamics among students in engineering workgroups. We explored these dynamics and their implications for
individual and group-based activities that are designed to prepare the students for the upcoming summative assessments. The first half of the semester focuses on technical writing and how to represent complex engineering ideas with visuals and written descriptions. The second half of the course focuses on down-selecting from all the creative concepts in the individually generated Idea Notebooks to one that will be presented as part of the Rocket Pitch. Then the students are given three weeks to work within the discussion section on their provisional patent applications.

3.0 Research Design

The project entails data collection at multiple levels that attend to the course design, pedagogy, and classroom environment that affect the students
students develop complex theory papers, starting with "low-stakes" writing activities that lead to "high-stakes" formal papers. This process incorporates a continuous improvement plan that uses several types of peer review. A campus-wide committee, referred to as the Writing in the Discipline Committee, also reviews and approves the pedagogical writing process used in the course. Student survey data is presented to measure student attitudes and perceptions. Sample grades are presented to show trends. Analysis, recommendations, and conclusions are given. The goal here is to present a useful case study for faculty interested in teaching a writing-intensive or WID course.

Background

There are two important background points that should be made. One, what type
Paper ID #23278

Successes and Challenges in Supporting Undergraduate Peer Educators to Notice and Respond to Equity Considerations within Design Teams

Dr. Chandra Anne Turpen, University of Maryland, College Park

Chandra Turpen is a Research Assistant Professor in the Physics Education Research Group at the University of Maryland, College Park’s Department of Physics. She completed her PhD in Physics at the University of Colorado at Boulder specializing in Physics Education Research. Chandra’s work involves designing and researching contexts for learning within higher education. In her research, Chandra draws from the
further develop students’ technical writing skills throughout the semester by introducing a three-part strategy: (1) Focused instruction time – allocating select times throughout the semester to focus on one section of the lab report; (2) Reviewing samples as a group – determining which samples or attributes of samples were effective or ineffective; and (3) Peer review – students reviewed each other’s lab reports and gave feedback. The goal of focused instructional time and reviewing samples was to allow students to improve their writing skills by focusing on one section of the lab report at a time, and thus learning the writing techniques more effectively. The peer-review part of the strategy was designed to draw students’ close attention to the quality of writing
Journal to General: Teaching Graduate Engineering Students to Write for All Audiences

Abstract - The Accreditation Board for Engineering and Technology (ABET) identifies “an ability to communicate effectively with a range of audiences” as a critical learning outcome for engineering programs. This underscores the importance of engineers learning to articulate their ideas clearly, not only to peers within their field but also to non-specialist audiences. While recently developed generative AI tools offer support for crafting written documents, they are not a substitute for mastering the foundational skills necessary for clear and effective technical communication. Moreover, students frequently find themselves unprepared for the