
SMARTER Teamwork: System for Management, Assessment, Research, Training, Education, and Remediation for Teamwork


Conference

2011 ASEE Annual Conference & Exposition

Location

Vancouver, BC

Publication Date

June 26, 2011

Start Date

June 26, 2011

End Date

June 29, 2011

ISSN

2153-5965

Conference Session

NSF Grantees Poster Session

Tagged Topic

NSF Grantees

Page Count

10

Page Numbers

22.1303.1 - 22.1303.10

Permanent URL

https://peer.asee.org/18978


Paper Authors


Matthew W. Ohland, Purdue University, West Lafayette (ORCID: orcid.org/0000-0003-4052-1452)


Matthew W. Ohland is Associate Professor of Engineering Education at Purdue University. He has degrees from Swarthmore College, Rensselaer Polytechnic Institute, and the University of Florida. His research on the longitudinal study of engineering students, team assignment, peer evaluation, and active and collaborative teaching methods has been supported by over $11.4 million from the National Science Foundation and the Sloan Foundation. His team received the William Elgin Wickenden Award for the Best Paper in the Journal of Engineering Education in 2008, as well as multiple conference Best Paper awards. Dr. Ohland is Chair of ASEE's Educational Research and Methods Division and an At-Large member of the Administrative Committee of the IEEE Education Society. He was the 2002-2006 President of Tau Beta Pi.


Richard A. Layton, Rose-Hulman Institute of Technology


Richard A. Layton is the past Director of the Center for the Practice and Scholarship of Education and Associate Professor of Mechanical Engineering at Rose-Hulman Institute of Technology. He received a B.S. from California State University, Northridge, and an M.S. and Ph.D. from the University of Washington. His areas of scholarship include student team formation and peer evaluation; persistence, migration, and retention in engineering education; expanding the use of cooperative and active learning in engineering laboratories; data analysis and visualization for investigating and presenting quantitative data; and modeling and simulation of dynamic systems. He is a guitarist and songwriter with the rock band “Whisper Down”.


Daniel Michael Ferguson, Purdue University, West Lafayette


Daniel M. Ferguson is a graduate student in the Engineering Education Program at Purdue University. Prior to coming to Purdue, he was Assistant Professor of Entrepreneurship at Ohio Northern University. Before assuming that position, he was Associate Director of the Inter-professional Studies Program and Senior Lecturer at Illinois Institute of Technology, where he was involved in research on service learning, assessment processes, and interventions aimed at improving attainment of learning objectives. Prior to his university appointments, he was the Founder and CEO of The EDI Group, Ltd. and The EDI Group Canada, Ltd., independent professional services companies specializing in B2B electronic commerce and electronic data interchange. The EDI Group companies conducted market research, offered educational seminars and conferences, and published The Journal of Electronic Commerce. He was also a Vice President at the First National Bank of Chicago, where he founded and managed the bank's market-leading professional Cash Management Consulting Group, initiated the bank's non-credit service product management organization and profit-center profitability programs, and was instrumental in the EDI/EFT payment system implemented by General Motors.


Misty L. Loughry, Georgia Southern University


Dr. Loughry earned a Ph.D. in management from the University of Florida in 2001. She also has an M.B.A. from Loyola College in Maryland and a B.A. from Towson State University. Before joining Georgia Southern University, she was a member of the faculty at Clemson University. Her research specialties are control in organizations, especially peer influences and other social controls, and teamwork, especially self- and peer evaluation of teamwork. Prior to beginning her academic career, Dr. Loughry worked for ten years in the banking field, holding positions including credit analyst, branch manager, and Assistant Vice President of Small Business Lending. Her research has been published in journals such as Organization Science, Educational & Psychological Measurement, Journal of Managerial Issues, Information and Management, Journal of Information Technology Management, Journal of Engineering Education, and Business Horizons.


David J. Woehr, University of Tennessee


David J. Woehr is a Professor in the Department of Management at the University of Tennessee, Knoxville. He received his Ph.D. in Industrial/Organizational Psychology from the Georgia Institute of Technology in 1989. Dr. Woehr's research focuses on the measurement and evaluation of individual job performance, managerial assessment centers, and applied measurement. Dr. Woehr currently serves as an associate editor for Human Performance and is an elected fellow of the Society for Industrial and Organizational Psychology (SIOP), the American Psychological Association (APA), and the Association for Psychological Science (APS).


Hal R. Pomeranz, Deer Run Associates


Hal Pomeranz is the lead developer of the SMARTER Teamwork tools. He is a Faculty Fellow of the SANS Institute and a nationally recognized expert in computer security and information systems management.



Abstract

The rapid adoption of Team-Maker and the Comprehensive Assessment of Team Member Effectiveness (CATME), tools for team formation and peer evaluation, makes it possible to extend their success to have a significant impact on the development of team skills in higher education. The web-based systems are used by over 700 faculty at over 200 institutions internationally.

This paper and its accompanying poster will describe strategies for broadening the scope of those tools into a complete system for the management of teamwork in undergraduate education. The System for the Management, Assessment, Research, Training, Education, and Remediation of Teamwork (SMARTER Teamwork) has three specific goals: 1) to equip students to work in teams by providing them with training and feedback; 2) to equip faculty to manage student teams by providing them with information and tools that facilitate best practices; and 3) to equip researchers to understand teams by broadening the system's capabilities to collect additional types of data so that a wider range of research questions can be studied through a secure researcher interface. The three goals of the project support each other in hierarchical fashion: research informs faculty practice, and faculty determine the students' experience, which, if well managed based on research findings, equips students to work in teams. Our strategies for achieving these goals are based on a well-accepted training model that has five elements: information, demonstration, practice, feedback, and remediation.

Different outcomes are expected for each group of people. For the students, both individual outcomes, such as student learning, and team outcomes, such as the development of shared mental models, are expected. For the faculty, individual outcomes such as faculty learning and faculty satisfaction are expected.
The outcomes for researchers will be community outcomes, that is, benefits for stakeholders outside the research team, such as generating new knowledge for teaming theory and disseminating best practices. Measuring these outcomes is the basis for the project's evaluation plan.

Research Overview. The broad and deep scope of the proposed SMARTER Teamwork research is summarized in Figure 1. The figure addresses the project's three broad research goals, the people impacted, the strategies for achieving the goals, and measurable outcomes.

Goals. The proposed work has three goals: 1) equip students to work in teams; 2) equip faculty to manage teams; and 3) equip this research team to understand student teams. These goals support each other in hierarchical fashion: research informs faculty practice, and faculty determine the students' experience, which, if well managed based on research findings, should equip students to work in teams.

People. People are the groups that will use the proposed system: students, faculty, and researchers. The hierarchy of people reflects the hierarchy of goals: the work of the research team supports the work of faculty, which in turn supports the work of students and their teams.

[Figure 1. System for the Management, Assessment, Research, Training, Education, and Remediation for Teamwork. The figure is a matrix relating the three goals and the three groups of people (students, faculty, researchers) to the five training-model strategies (information, demonstration, practice, feedback, and remediation) and to the expected individual outcomes (e.g., student and faculty learning and satisfaction), team outcomes (e.g., shared mental models, collective efficacy, cohesiveness, climate, viability, conflict), and community outcomes (e.g., teaming theory, best practices, number of users, publications, an improved SMARTER toolkit).]

Strategies. For each group of people — students, faculty, and researchers — we developed strategies for achieving our goals based on a well-accepted training model that has five elements: information, demonstration, practice, feedback, and remediation.
By following this model, we will enable the people affected by the system to become proficient in teamwork (all users), in managing teamwork (faculty and researchers), and in creating new knowledge about teamwork (researchers).

Outcomes. Different outcomes are expected for each group of people. For the students, both individual outcomes, such as student learning, and team outcomes, such as shared mental models, are expected. For the faculty, individual outcomes such as faculty learning and faculty satisfaction are expected. The outcomes for the research team will be community outcomes, that is, benefits for stakeholders outside the research team, such as generating new knowledge for teaming theory and disseminating best practices. Measuring these outcomes is the basis for the project's evaluation plan.

HIGHLIGHTS OF THE PAST YEAR

• Continued growth of the CATME and Team-Maker user base;
• System improvements, including repairs addressing usability concerns;
• Progress toward development of the SMARTER system;
• Development of material for training vignettes, including selection of video clips for training using video-based modeling and video vignettes (permission to use the video clips has been granted);
• Further progress on databases of literature on team formation and (separately) peer evaluation;
• Multiple workshops promoting the system were conducted, with more scheduled.

CONTINUED GROWTH OF THE CATME AND TEAM-MAKER SYSTEMS

The growth in users of the CATME and Team-Maker system has been substantial. Since October 2005, 1,042 instructors at 296 different institutions have registered to use the system, collecting ratings from 46,252 unique student users. As shown in Figure 2, system use has grown dramatically.

[Figure 2. Growth in the Number of Faculty and Institutions using CATME Team Tools. The plot shows the number of faculty and staff users and the number of institutions over the 66 months since the rollout of the CATME system, from October 2005 through September 2010.]

The most recent growth in system use has introduced an interesting complication: as the user base expands, it extends beyond the "early adopters," who are comfortable manipulating the interface with little guidance. The most recent users are more likely to seek help getting started, which can be quite time-consuming. Rather than divert resources to technical support, a usability study of the interface (scheduled as part of this project) has revealed opportunities to make the interface more accessible to a broader audience.

DEVELOPMENT OF MATERIAL FOR TRAINING VIGNETTES

The use of critical incident analysis

The development of training vignettes is a central strategy for this project. Our plan was to use a critical incident methodology to identify a wide variety of team behavior to include in the vignettes. Originally developed by Flanagan (1954), the critical incident technique gathers specific, behaviorally focused descriptions of work or other activities. Bownas & Bernardin (1988) assert that "a good critical incident has four characteristics: it is specific, focuses on observable behaviors exhibited on the job, describes the context in which the behavior occurred, and indicates the consequences of the behavior." Thus, a good critical incident describes behaviors, rather than traits or judgmental inferences. Normally, critical incident data are collected by asking subject matter experts to describe particularly effective or ineffective behaviors from their experience, a content analysis identifies underlying dimensions of performance, and the critical incidents are rewritten to highlight the underlying dimensions that were found.
In this work, the critical incident technique was used to develop the behaviorally anchored rating scale for the CATME instrument as well as the sample vignette developed earlier. In this stage of the research, it is important to develop additional vignettes, but subject matter experts close to this work were struggling to identify enough critical incidents to support the development of a large pool of behaviors aligned with the dimensions of the CATME instrument.

Identifying behavioral descriptions from student comments

The research team has identified another source of behavioral descriptions that can be used for vignettes: student comments about their teammates. Large numbers of peer evaluations have been conducted, and the research team has access to a large volume of comments students have made about their teammates. These comments are a rich source of behavioral descriptions. A large volume of student comments has been processed by two undergraduate researchers to distill those comments down to essential behaviors. This task is ongoing and has been taken over by a graduate assistant at Purdue. The process requires:

• Deleting non-behavioral comments (e.g., "Nice guy!" and "nothing to say, really.");
• Eliminating redundant phrasing to isolate a superset of unique behavioral descriptions;
• Reducing all comments to the most basic elements representing a single behavior; and
• Removing all names and pronouns.

Building vignettes from individual behavioral comments

As the comments are processed, graduate students at the University of Central Florida will convert those behavioral elements into phrases that remain gender neutral but are complete sentences. Calibration ratings for each behavioral phrase will be determined by subject matter experts. Where there is significant disagreement about the category to which a behavior is assigned or about the rating level, behaviors will be deleted as ambiguous.
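As a concrete illustration, parts of the comment-distillation process described above could be automated. The sketch below is a minimal, hypothetical Python pass; the keyword list, the "[member]" placeholder token, and the function name are illustrative assumptions, not part of the team's actual workflow, and the judgment-intensive step of reducing each comment to a single basic behavior is assumed to remain a manual task for the researchers.

```python
import re

# Hypothetical filter list and pronoun pattern -- illustrative assumptions only.
NON_BEHAVIORAL = {"nice guy!", "nothing to say, really."}
PRONOUNS = re.compile(r"\b(he|she|him|her|his|hers|they|them|their)\b", re.IGNORECASE)

def clean_comments(comments, names):
    """Distill raw peer-evaluation comments into unique, anonymized behavioral phrases."""
    seen, behaviors = set(), []
    for comment in comments:
        text = comment.strip()
        # Delete non-behavioral comments.
        if text.lower() in NON_BEHAVIORAL:
            continue
        # Remove names and pronouns (a neutral placeholder keeps the sentence readable).
        for name in names:
            text = re.sub(re.escape(name), "[member]", text, flags=re.IGNORECASE)
        text = PRONOUNS.sub("[member]", text)
        # Eliminate redundant phrasing: keep one copy of each normalized form.
        key = re.sub(r"\W+", " ", text.lower()).strip()
        if key not in seen:
            seen.add(key)
            behaviors.append(text)
    return behaviors

print(clean_comments(
    ["Nice guy!", "Alex always missed our meetings", "alex always missed our meetings!!"],
    names=["Alex"],
))  # -> ['[member] always missed our meetings']
```

A pass like this would only pre-filter the raw comments; categorizing each resulting phrase against the CATME dimensions and assigning calibration ratings would still fall to the subject matter experts.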
In preliminary work, the software developer has designed a system that will piece together a collection of behavioral phrases into a comprehensive vignette that spans all the behavioral dimensions measured by CATME.

PROGRESS ON DATABASES OF LITERATURE

Databases of literature on both team formation and peer evaluation are being developed. While under development, these resources are for internal use only. As they near completion, they will be released, and faculty who use the databases will have the opportunity to propose additions. The team is concerned that such a literature database will quickly grow stale as new work emerges that must be added. In discussion, the team identified the ideal solution to this problem: an automated system trained to perform certain search tasks regularly to dynamically update the database. The development of such a system would be well beyond the scope of this grant, so the team will look for opportunities to leverage this work.

PUBLICATIONS (Journal and Conference)

• Ohland, Matthew W., Richard A. Layton, Misty L. Loughry, Hal R. Pomeranz, Eduardo Salas, and David J. Woehr, "SMARTER Teamwork: System for Management, Assessment, Research, Training, Education, and Remediation for Teamwork," American Society for Engineering Education 2010 Annual Conference.
• Ohland, Matthew W., Lisa G. Bullard, Richard M. Felder, Cynthia J. Finelli, Richard A. Layton, Misty L. Loughry, Hal R. Pomeranz, Douglas G. Schmucker, and David J. Woehr, "The Comprehensive Assessment of Team Member Effectiveness: Development of a Behaviorally Anchored Rating Scale for Self and Peer Evaluation," Academy of Management 2010 Annual Meeting.
• Ohland, M.W., M.L. Loughry, D.J. Woehr, C.J. Finelli, L.G. Bullard, R.M. Felder, R.A. Layton, H.R. Pomeranz, and D.G. Schmucker, "The Comprehensive Assessment of Team Member Effectiveness: Development of a Behaviorally Anchored Rating Scale for Self and Peer Evaluation," in revision for Academy of Management: Learning & Education, March 26, 2010, Manuscript ID AMLE-RR-2010-0056.
• Layton, R.A., M.L. Loughry, M.W. Ohland, and G.D. Ricco, "Design and Validation of a Web-Based System for Assigning Members to Teams Using Instructor-Specified Criteria," Advances in Engineering Education, 2(1), Spring 2010, pp. 1-28.
• Zhang, B., and M.W. Ohland, "How to Assign Individualized Scores on a Group Project: An Empirical Evaluation," Applied Measurement in Education, 22(3), 2009.
• Layton, R.A., M.L. Loughry, and M.W. Ohland, "Design and Validation of a Web-Based System for Assigning Members to Teams Using Instructor-Specified Criteria," accepted with revisions to Advances in Engineering Education, September 10, 2008, MS AAE-09-078.
• Meyers, K., S. Silliman, and M. Ohland, "Comparison of Two Peer Evaluation Instruments for Project Teams," Proceedings of the American Society for Engineering Education Annual Conference, Pittsburgh, PA, June 2008.

PRESENTATIONS

• Ohland, M., "Managing Teams," Project-Centered Learning Symposium, Cambridge-MIT Institute, http://web.mit.edu/cmi/ue/workshop2008/, March 18, 2008.
• Layton, R.A., M.L. Loughry, M.W. Ohland, and H.R. Pomeranz, "Assigning Students to Teams: Scholarship, Practice, and the Team-Maker Software System," ASEE/IEEE Frontiers in Education Conference, Saratoga Springs, NY, October 22, 2008.
• Pomeranz, H.R., "Managing Student Teams: Scholarship, Practice, and the Team-Maker/CATME Applications," Faculty Brown Bag Lunch Series, Oregon State University, February 20, 2009.
• Layton, R.A., M.L. Loughry, M.W. Ohland, and H.R. Pomeranz, "Resources for Student Teams: The Team-Maker and CATME Systems (and Why They Work)," Academy of Process Educators Conference, Gaston College, July 9, 2009.
• Ohland, M.W., "Tools for Teams," invited workshop, Wichita State University, October 30, 2009.
• Ohland, M.W., "Teams: Creating a Community of Learning Through Peer Accountability," invited talk, Clemson University Environmental Engineering and Environmental Science, November 20, 2009.
• Layton, R.A., M.L. Loughry, and M.W. Ohland, "The Effective Management of Student Teams Using the CATME/Team-Maker System: Practice Informed by Research," invited to the Capstone Design Conference 2010, June 7-9, 2010, Boulder, CO.
• Layton, R.A., M.L. Loughry, and M.W. Ohland, "Research Into Practice: Tools for Effective Management of Student Teams," workshop at the American Society for Engineering Education 2010 Annual Conference.
• Layton, R.A., M.L. Loughry, M.W. Ohland, and H.R. Pomeranz, "Workshop on the Effective Management of Student Teams Using the CATME/Team-Maker System," submitted to the INGRoup (Interdisciplinary Network for Group Research) Conference, Arlington, VA, July 22-24, 2010.

OTHER DISSEMINATION

• Team-Maker/CATME flyers distributed at the Mudd Design Workshop, May 2009, Claremont, CA, and at the INGRoup (Interdisciplinary Network for Group Research) conference in Colorado Springs, CO, July 2009.
• Richard Layton will champion the development of presentation resources so that other members of the team can effectively promote the use of the system. Further, our "power users" – those who use the system frequently and who are very excited about using it – might be able to give presentations on behalf of the team, particularly to smaller groups of faculty at their own institutions.
• Hal Pomeranz will investigate the possibility of a user group conference at San Francisco State University.
Depending on the success of such an event, similar events could follow at other sites with active user communities, such as the University of Southern Maine, Rose-Hulman Institute of Technology, and Georgia Southern University. A multi-site EPICS conference might be possible.

PUBLICATIONS PLANNED

A paper validating the results of the Team-Maker algorithm is nearing acceptance for publication; we have heard that we may expect favorable word shortly. Because that paper focused on the validation of the system, only a short section was devoted to which criteria are most useful in team formation and why. A manuscript is under development that focuses explicitly on this subject: a broad review of the literature on what criteria are used to form teams, why, and in what contexts.

In addition to this manuscript on best practices in team formation, papers are also planned reviewing best practices in peer evaluation and, separately, best practices in managing teams. This last paper will benefit from a 2006 meta-analysis on which Co-PI Salas was a co-author (Burke et al., 2006).

Dave Woehr (Co-PI) has nearly completed a significant work validating the CATME behaviorally anchored rating scale. This is a three-study paper that provides firm evidence for the instrument. Abbreviated results from these papers are provided in the Findings section. Data from a time-series experimental design at Rose-Hulman will be analyzed to determine the effects of rater training. Data from Rose-Hulman and other schools at which students use CATME in multiple courses over multiple semesters provide an opportunity to study the longitudinal effects of using the CATME system.

A manuscript is also planned that publishes and analyzes the details of the Team-Maker algorithm design. A diverse literature has been published on strategies for team formation, but few of those strategies have achieved widespread implementation. This makes it possible to author a paper that addresses not only the theoretical design of a team-formation algorithm, but also studies the results of using that algorithm.

REFERENCES

Bownas, D., & Bernardin, H. (1988). Critical incident technique. In S. Gael (Ed.), The job analysis handbook for business, industry and government (Vol. 2, pp. 1120-1137). New York: Wiley.

Burke, C. S., Stagl, K. C., Klein, C., Goodwin, G. F., Salas, E., & Halpin, S. M. (2006). What types of leadership behaviors are functional in teams? Leadership Quarterly, 17, 288-307.

Flanagan, J. C. (1954). The critical incident technique. Psychological Bulletin, 51(4), 327-358.

Ohland, M. W., & Layton, R. A., & Ferguson, D. M., & Loughry, M. L., & Woehr, D. J., & Pomeranz, H. R. (2011, June), SMARTER Teamwork: System for Management, Assessment, Research, Training, Education, and Remediation for Teamwork Paper presented at 2011 ASEE Annual Conference & Exposition, Vancouver, BC. https://peer.asee.org/18978

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2011 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015