Location: San Antonio, Texas
Publication Date: June 10, 2012
Start Date: June 10, 2012
End Date: June 13, 2012
ISSN: 2153-5965
Division: Civil Engineering
Page Count: 14
Page Numbers: 25.230.1 - 25.230.14
DOI: 10.18260/1-2--20990
Permanent URL: https://peer.asee.org/20990
Download Count: 590
Kevin Sutterer is Professor and Head of Civil Engineering at Rose-Hulman Institute of Technology in Terre Haute, Ind. He received B.S. and M.S. degrees in civil engineering at the University of Missouri, Rolla, a second M.S. in civil engineering at Purdue University, and a Ph.D. from Georgia Institute of Technology. Although his specialization is geotechnical engineering, he has consulted in environmental and structural engineering as well and currently teaches courses in geotechnical and structural engineering. Sutterer was a geotechnical consultant with Soil Consultants, Inc. of St. Peters, Mo., from 1984 to 1988. He also served as Director of Engineering Services for SCI Environmental of Chesterfield, Mo., from 1988 to 1989 before leaving practice to pursue his Ph.D. Sutterer was an Assistant Professor at the University of Kentucky from 1993 to 1998 and has been a faculty member at Rose-Hulman since then. He is currently Director of Rose-Hulman’s Engineering Forensics Research Institute, and he continues to do some consulting along with his other academic duties. Sutterer has served the Civil Engineering Division of ASEE for nearly 10 years and was Division Chair in 2010-11. He has also served on numerous ASCE committees. In addition to receiving numerous teaching awards over the years, he was selected by the Kentucky Society of Professional Engineers and the National Society of Professional Engineers as their 1996 Young Engineer of the Year.
James Hanson is an Associate Professor of civil engineering at the Rose-Hulman Institute of Technology, where his teaching emphasis is structural analysis and design. He is a member of Rose-Hulman’s Commission on the Assessment of Student Outcomes, and has been rating student portfolios for more than eight years.
Assessment of Student Outcomes Using Industry-Academia Assessment Teams

Abstract

(school name’s) Department of Civil Engineering is using assessment teams comprised of industry professionals and faculty members working together to assess student outcomes for continuous improvement. This industry assessment uses one of two approaches. For the first approach, assessment of some student outcomes is performed by teams of four industry experts during (school name)’s annual Board of Advisors’ meeting. This assessment is conducted specifically on senior capstone design reports from the prior academic year. In this approach, faculty members are available to answer questions about the students’ work and to receive advice, but not to assess. The department rates six different student outcomes using this approach.

The second assessment approach is conducted on all other student work submitted for assessment of department-specific student outcomes. In a single year, this requires rating a total of approximately 30 different sets of submissions from students. This assessment is facilitated using (school name)’s online electronic portfolio system to allow remote access and rating of student work. Each industry professional is teamed with one faculty member to conduct rating of student work submissions. The teams meet by phone and email regularly during each rating session to discuss the outcome criterion, student submissions, the rubrics for rating submissions, and inter-rater reliability. Upon completion of each rating session, the team provides the department with an overview that includes advice for improving student learning and for criterion or rubric revision, if appropriate.

Permitting industry professionals to work directly with student submissions has accelerated the continuous improvement process in the department. External industry professionals are likely to apply an even higher standard of expectation to student work, and they provide insights not readily apparent to faculty members who are immersed daily in facilitating the learning process. This has resulted in a reduction in passing rates for some student work, thus fostering greater leaps in improvement of learning in those outcomes. Team review of student work also facilitates greater levels of cooperation and communication between faculty members and industry colleagues, ultimately enhancing student learning.

Findings will be reported as (1) a comparison of passing-rate statistics before and after inclusion of industry raters, (2) +/∆ reflections on the process by both industry and faculty raters, and (3) +/∆ reflection on the process by the administrator of the rating. We recommend that other institutions consider use of industry raters for student outcomes because of the accelerated continuous improvement and increased collaboration between industry and academia. Programs are cautioned that inclusion of industry raters adds another dimension to the planning that increases the administrative burden, and that passing percentages for student work will likely decrease when industry raters are included.
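The abstract notes that each faculty-industry rating team discusses inter-rater reliability during rating sessions. The paper does not specify how agreement is quantified; the Python sketch below is one common option, Cohen's kappa for two raters over a shared set of submissions. All rater names, ratings, and the pass/fail scale are illustrative assumptions, not data from the paper.

# Minimal sketch: Cohen's kappa for a two-person rating team.
# Ratings below are hypothetical; the paper's actual rubric and data differ.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    # Chance agreement: probability both raters pick the same category independently.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical pass/fail ratings on ten student submissions.
faculty  = ["pass", "pass", "fail", "pass", "pass", "fail", "pass", "pass", "pass", "fail"]
industry = ["pass", "fail", "fail", "pass", "pass", "fail", "fail", "pass", "pass", "fail"]

print(f"kappa = {cohens_kappa(faculty, industry):.2f}")  # 0.60 for these ratings

A kappa well below 1.0, as in this example, would prompt the kind of team discussion of criteria and rubrics the abstract describes before rating continues.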
Sutterer, K. G., & Robinson, M., & Hanson, J. H., & Reeves, M. C., & Twarek, A. B. (2012, June), Assessment of Student Outcomes Using Industry-Academia Assessment Teams Paper presented at 2012 ASEE Annual Conference & Exposition, San Antonio, Texas. 10.18260/1-2--20990
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2012 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015