Exploring the Value of Peer Assessment

Conference

2016 ASEE Annual Conference & Exposition

Location

New Orleans, Louisiana

Publication Date

June 26, 2016

Start Date

June 26, 2016

End Date

August 28, 2016

ISBN

978-0-692-68565-5

ISSN

2153-5965

Conference Session

Software Engineering Constituent Committee Division Technical Session 3

Tagged Division

Software Engineering Constituent Committee

Page Count

10

DOI

10.18260/p.26872

Permanent URL

https://peer.asee.org/26872

Download Count

459

Paper Authors

Sally Sue Richmond Pennsylvania State University, Great Valley

Sally Sue Richmond is a Lecturer in Information Science at the School of Graduate Professional Studies, Penn State Great Valley. Richmond has a B.A. in Art and an M.S. in Information Science from The Pennsylvania State University. She has 25+ years experience in industry as a software developer, network analyst, trainer, and Help Desk supervisor. She teaches courses in Human-Computer Interaction, Computer Organization and Design, Computer Forensics, Microprocessors and Embedded Systems, Networking, and IS Architecture. She has published articles in conference proceedings and journals in the areas of concept mapping, cognitive style, and engineering education.

Kailasam Satyamurthy Penn State University

Dr. Kailasam Satyamurthy is an Assistant Professor in Engineering at Penn State University. He earned his Ph.D. in Engineering Mechanics from Clemson University and an MBA from Penn State. Before joining Penn State, he was a senior manager at Vanguard for 8 years and head of the engineering department at GenCorp for 20 years. He teaches Decision and Risk Analysis, Business Statistics, Finance and Economics for Engineers, Quantitative Methods in Finance, and Quality and Continuous Improvement courses at Penn State. At GenCorp, he did extensive research in mathematical modeling and developed methodologies and algorithms for the nonlinear finite element analysis of mechanical systems under mechanical and thermal loadings. He is also a Six Sigma Master Black Belt and has trained numerous professionals in the manufacturing, transactional, and healthcare industries.

Joanna F. DeFranco Pennsylvania State University, Great Valley

Joanna F. DeFranco is an Assistant Professor of Software Engineering in the School of Graduate Professional Studies, Penn State Great Valley. Dr. DeFranco holds a B.S. in Electrical Engineering from The Pennsylvania State University, an M.S. in Computer Engineering from Villanova University, and a Ph.D. in Computer and Information Science from New Jersey Institute of Technology. She teaches in both the resident and online software engineering, systems engineering, and engineering management graduate degree programs. She has published a number of articles in journals and conference proceedings in the areas of technical teams and engineering education.


Abstract

Student assessment can be a complex task for an engineering instructor. An instructor can easily measure a student's declarative knowledge with a written exam. However, engineering students must also gain procedural knowledge from their courses to succeed in a competitive workforce, and projects are generally used to measure that procedural knowledge. Assessing software engineering projects poses many challenges. The difficulty stems from the fact that a single software requirement may be implemented with varying degrees of complexity. For example, in a software construction course where students develop a software system, one requirement may be to capture user input for the client portion of the system. Students may satisfy this requirement with varying degrees of complexity depending on their development experience. Since there is no single correct answer, instructors may use additional modes of assessment for such projects, such as peer assessment. Some instructors hesitate to use student peer evaluation to assess projects because they believe the reviews may be biased. However, research has shown that self-assessment and peer assessment are more effective than instructor formative assessment [1].

We have collected peer-assessment and self-assessment data from a resident section of a software construction course. This course is a core requirement in a graduate software engineering program at a large research university. We are evaluating the peer-assessment data to determine the effectiveness of the peer-assessment process as compared to a formative assessment by the instructor.
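The abstract does not specify how peer scores are compared against instructor scores; one common approach in this kind of study is to average each project's peer scores and measure their agreement with the instructor's scores via Pearson correlation. The sketch below illustrates that approach with invented data; the score scale, dataset, and statistical method are all assumptions, not details from the paper.

```python
# Hypothetical sketch: agreement between mean peer scores and instructor
# scores. All data values here are invented for illustration only.
from statistics import mean

# Each inner list holds the peer scores one project received (assumed 0-10).
peer_scores = [
    [8, 7, 9],   # project A
    [6, 6, 7],   # project B
    [9, 9, 8],   # project C
    [5, 6, 5],   # project D
]
instructor_scores = [8, 6, 9, 5]  # one instructor score per project

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Collapse each project's peer reviews to a single mean, then correlate.
peer_means = [mean(scores) for scores in peer_scores]
r = pearson(peer_means, instructor_scores)
print(f"peer-vs-instructor agreement: r = {r:.2f}")
```

A correlation near 1 would suggest peer assessment tracks the instructor's formative assessment closely; rank-based measures (e.g., Spearman) would be an alternative if the score scale is ordinal.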

1. De Sande, J. C. G., & Godino-Llorente, J. I., "Peer Assessment and Self-Assessment: Effective Learning Tools in Higher Education," International Journal of Engineering Education, Vol. 30, No. 3, pp. 711-721, 2014.

Richmond, S. S., & Satyamurthy, K., & DeFranco, J. F. (2016, June), Exploring the Value of Peer Assessment. Paper presented at the 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. 10.18260/p.26872

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2016 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015