
Invited Paper - SPARKPLUS: Enabling collaboration and dialogue for learning and developing standards



2013 ASEE International Forum


Atlanta, Georgia

Publication Date

June 22, 2013

Start Date

June 22, 2013

End Date

June 22, 2013

Conference Session

Track 2 - Session I - Curriculum Development

Tagged Topic

Invited - Curriculum Development

Page Numbers

21.48.1 - 21.48.10




Paper Authors


Keith Willey University of Technology Sydney


KEITH WILLEY (BE 1st Hons and Medal, PhD) is a member of the Faculty of Engineering and Information Technology at the University of Technology, Sydney. He commenced his academic career after 20 years in the Broadcasting and Communications industry. In the area of education, Keith’s research interests include the learning and assessment associated with working in groups, the use of self and peer assessment for collaborative peer learning, the nature of informal learning in professional practice, flipped learning, academic standards, and improving peer review.
Keith is an Australian Learning and Teaching Council Fellow. He has received several awards including an Engineers Australia Engineering Excellence Award (Education and Training), the UTS Medal for Teaching and Research Integration and both the Australasian Association of Engineering Education (AaeE) Teaching Excellence and Research Design Awards.
Keith has been a visiting scholar at universities in Australia, Europe, North America and Asia. His commitment to developing high-quality teaching and learning practices is supported by his educational research, which has been published in numerous conference papers and journal articles. Keith is the Project Manager and lead developer of the self and peer assessment software tool known as SPARKPLUS. This software is currently being used by faculty at over 20 Australian universities and several universities and high schools in Europe, Asia, and North and South America.



Anne P Gardner University of Technology, Sydney


ANNE GARDNER is a member of the Faculty of Engineering and Information Technology at the University of Technology, Sydney (UTS). Anne’s research is in engineering education where she works with Dr. Willey in improving understanding of the learning associated with and assessment of collaborative learning, workplace learning by professional engineers, and improving the peer review process for engineering education publications. Anne also contributes to the development of the software tool SPARKPLUS.
Anne has received recognition for her work in educational research and development, including an Engineers Australia Engineering Excellence Award, an Australasian Association of Engineering Education (AaeE) Teaching Excellence Award and a Research Design Award. Anne is currently a UTS Learning 2014 Fellow, a role requiring leadership in demonstrating and disseminating innovative teaching and learning practices throughout the university.




SPARKPLUS: enabling collaboration and dialogue for learning and developing standards

Abstract

Professional learning is often informal, learnt on the job through engaging in practice with peers. Hence, to prepare students for professional practice they require opportunities to develop their ability to work in such collaborative / socially constructed learning environments.

The authors have conducted several studies investigating the impact of collaborative learning activities on the people that participate in them. We found thoughtful design is required, including scaffolding, to motivate desired approaches and attitudes to learning. The results of these studies informed the development of a collaborative learning activity framework and the educational technology tool SPARKPLUS. In this paper we use exemplar activities to describe the findings of these studies and demonstrate both the framework and the support provided by SPARKPLUS.

Introduction

There is an expectation by organisations that the professionals they employ, including engineers, engage in ongoing learning in order to meet the demands of continuing change. Much of this learning is informal, learnt on the job through practice with peers.

Recent writers on workplace learning 1, 2 argue that many traditional assumptions about professional learning are problematic in that learning has often been seen as something that individuals do, for example attending a course. This simplistic view fails to consider how the social dimensions of work provide a rich context for professional learning. More specifically, some of these studies show that the work is not only a context, or backdrop, but is fundamentally implicated in learning 3, 4, 5.
Hence, to prepare students for professional practice they require opportunities to practise, experience, reflect and improve their ability to work in collaborative / socially constructed learning environments.

In an educational context, collaboration is generally described as an approach involving joint intellectual efforts between students, or between students and the instructor 6. Dana 7 reports that compared to traditional competitive or individualistic learning environments, the benefits of collaborative tasks include higher student achievement, greater use of higher-level reasoning and critical thinking skills, more positive attitudes toward the subject matter and satisfaction with the class. However, the benefits are not automatic. Thoughtful design, assessment scaffolding and the support of educational technology, particularly in large classes, contribute to both their success and sustainability.

In this paper we use exemplar activities to describe the findings of several studies that informed the development of a framework and educational technology to support learning through collaboration and dialogue and the development of professional judgement.

Background

The authors have conducted several studies investigating the impact of technology-assisted collaborative learning activities 8-12. Our findings highlighted the need to develop activities that cultivate students' judgement, facilitate peer feedback, promote learner independence, and reinforce development of their professional engineering identity.

Our aim is to promote a learning-focused as opposed to a task-focused disposition in students. A student's core identity may be such that they resist this change in focus, limiting their engagement with these activities.
We found scaffolding to be valuable to motivate desired approaches, behaviours and attitudes to learning. For example, we constantly remind students that "mistakes compress learning" and that to benefit most from any activity they should be pushing their learning boundaries until they make mistakes and/or discover what they do not know 13. In addition, we found that in summative activities students, with some justification, tend to strategically focus on how to achieve the best mark. Conversely, formative collaborative activities provide a low-risk environment 14 allowing students to push their learning boundaries, make mistakes, identify gaps in their learning and have these addressed by their peers. However, we acknowledge the need to develop quality scaffolding to motivate the participation of some students in formative activities. Furthermore, we suggest that without the assistance of educational technology the administrative burden and cost of providing these types of collaborative activities, especially in large classes, is likely to be unsustainable.

Analysis of the results of these studies informed the collaborative learning activity framework (Figure 1) and the educational technology tool SPARKPLUS 15 that are previewed in the remainder of this paper.

Figure 1: Collaborative Learning Activity Framework

Framework

The first step in any collaborative learning activity should be an individual task, usually undertaken out of class, allowing participants to identify gaps in their own learning / understanding. This is followed by a group task where participants have their learning gaps addressed by their peers while completing the activity collaboratively. The dialogue within this task not only provides the social dimensions important to learning but provides a discourse to challenge participants' understanding and judgement, convert tacit understandings to explicit explanations and socially construct meaning, language and standards.
Breakout groups are then brought back together for the facilitator to clarify any outstanding issues. To discern a difference, learn and develop judgement, one must have experienced a variation from previous experience 16. Hence, we recommend that instructors next vary an aspect of the activity to change the outcome and have participants complete this first individually, then collaboratively, to verify their understanding. Finally, a confirmation task that applies the learning in a new context and/or a more complex situation should be undertaken individually to confirm understanding and reduce the occurrence of "collective ability" 17-20 (where, as members of a team, participants appear to understand the activity learning outcomes but are unable to demonstrate this learning individually); then collaboratively as part of the next repeated cycle.

Both students and academics are active participants in the various stages of the framework, albeit with differing roles. The following sections describe how the framework was applied to two different activities: one where the participants were academics, and in the other, students.

Tutor Benchmarking

The motivation for this research is the international trend to focus on learning-oriented assessment activities and demonstrated learning outcomes. While a key factor in these activities is the provision of feedback to students, the capacity of academics to provide quality judgements and feedback is often taken for granted. Without appropriate consensus around the meaning and understanding of academic standards there is no assurance that assessment standards and practices are valid and/or reliable. Furthermore, if academics don't understand or can't articulate the standards they are assessing, how can they provide students with quality learning-oriented feedback on their work?
In several studies 21-23 we found significant benefits in implementing the framework (Figure 1) using SPARKPLUS to co-construct understandings of academic standards amongst instructors and tutors, as described below.

Application of the Framework for Tutor Benchmarking using SPARKPLUS

Prior to Assessment Meeting

Individual assessment: assessors/tutors are provided with a copy of two pieces of work to grade against specified criteria, entering their assessment (grades, reasoning and feedback comments) into the multiple assessor tool in SPARKPLUS.

During Assessor/Tutor Meeting

Collaborative discussion and assessment: tutors log on to SPARKPLUS or are provided with a printout of the SPARKPLUS results (Figure 2) to compare their grading and comments to those of the other tutors (displayed anonymously) and reflect on any differences. Tutors are formed into small groups to discuss their individual grading and subsequently to re-grade each report collaboratively, reaching a consensus, and provide feedback comments on it.

Facilitator-led discussion: a facilitator, often the course co-ordinator, leads a discussion focusing on exploring any differences in grading and/or reasoning that have emerged in the small group discussions.

Vary activity: to assist participants to clarify and reflect on their judgement we recommend that the facilitator vary an aspect of the activity to change the outcome; for example, by modifying an assessment criterion or task objective. Let's assume that the original task was to produce a five-minute video to teach a professional audience cardiopulmonary resuscitation (CPR) and an assessment criterion asks tutors to evaluate the video's capacity to engage the target audience. The introduced variation could be to change the target audience to teenagers.
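SPARKPLUS automates this anonymous comparison of tutor grades; the paper does not describe its internals, so the following Python sketch is purely illustrative of the kind of per-criterion summary such a results screen might compute (the function name and the rating data are hypothetical):

```python
from statistics import mean, stdev

def rating_summary(ratings):
    """Summarise anonymous tutor ratings for one criterion.

    ratings: numeric grades (e.g. out of 100) entered independently by
    each tutor before the assessment meeting. Returns the group mean,
    the spread, and the rating furthest from the mean -- a prompt for
    discussion in the tutor meeting, not a verdict on who is "wrong".
    """
    avg = mean(ratings)
    spread = stdev(ratings) if len(ratings) > 1 else 0.0
    outlier = max(ratings, key=lambda r: abs(r - avg))
    return {"mean": round(avg, 1), "spread": round(spread, 1), "outlier": outlier}

# Hypothetical ratings of one report, against one criterion, by five tutors
print(rating_summary([62, 65, 58, 80, 63]))
# prints {'mean': 65.6, 'spread': 8.4, 'outlier': 80}
```

Surfacing the most divergent rating as a discussion prompt, rather than discarding it, mirrors the paper's later observation that the outlying tutor sometimes raises an issue the others had not considered.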
The cycle is now repeated to highlight tutors' grading sensitivities and broaden their understanding, with them reassessing the work considering the varied task objective, first individually then collaboratively.

Finally, a confirmation task is undertaken where tutors assess an additional piece of work to confirm their understanding and capacity to articulate the reasons for their assessments. Again this confirmation task is undertaken first individually, then collaboratively, followed by a discussion exploring any outstanding differences in grading and/or reasoning.

Impact/Discussion

Implementing the framework using the software tool proved to be an efficient and effective process to:

- socially construct tutors' understanding of assessment criteria, including agreeing on the factors to consider and their relative importance when assessing against the criteria,
- benchmark their judgement and reasoning against other tutors and instructors,
- develop a shared descriptive language to improve feedback comprehension, and
- assist tutors to explicitly articulate their tacit judgements, allowing them to improve feedback to students.

Tutors were overwhelmingly positive about the impact of this collaborative activity, providing comments such as: "It validated my understanding of the subject and fine-tuned a few concepts", and "It was a fast way to get issues discussed and resolve differences".

Figure 2: Results screen for tutor benchmarking activity showing how the course co-ordinator (triangle on top side of slider) and the tutors (triangles on bottom side of slider) rated the work, including any comments they made (to save space a number of criteria have been removed from this screenshot).

In regard to the software's facilitation, tutors reported that the screens (e.g. the results screen in Figure 2) made it easy for them to observe where their opinions and reasoning differed from the other tutors' and where, as a group, there was the most agreement and disagreement. Tutors also found that the comment summary screen (Figure 3) helped them to understand the reasons for grading differences and highlighted issues to discuss in the collaborative dialogue: "I was able to see what they were thinking and learn and improve my own [feedback]...". While the differences in tutor perspectives were initially exposed by comparing their individual ratings in SPARKPLUS, it was in the subsequent collaborative dialogue, where these differences were explored and discussed, that the standards to be used in grading were co-constructed. Because comments are anonymised, each comment is discussed on its merits, free from the bias that may result from dominant personalities or perceived differences in expertise. Interestingly, it was sometimes the tutor whose rating differed the most from their peers' who raised an issue others had not previously considered but, after discussion, all tutors agreed was important.

In addition, tutors reported that observing the differences and ambiguity in the language each used (to explain their reasoning on the same report) helped them appreciate our previous findings that discrepancies and ambiguities in feedback language significantly contribute to students' perception that grading is unfair, commenting that: "I can see consistency across the tutors is important", and "I can see the potential for frustration by the students". Exploring these differences through discussion not only required tutors to explicitly articulate what were often previously tacit judgements but also to co-construct a language to describe reports of different standard, allowing them to provide more specific feedback to students.

Figure 3: The comment summary screen allows all comments to be displayed at once or only for a preselected rating.

Student Collaborative Learning

The learning benefits of combining the framework and SPARKPLUS have also been applied to students in a range of activities, including the 'flipped' activity described below.

Pre-class, self and peer assessment formative learning activities are designed to be undertaken at the start of each subject topic, enabling students to assess their level of understanding before in-class lectures. The intention is that class time is spent on higher cognitive interactive activities, focusing on material that most students 'don't get', rather than on material that students can understand by themselves.

Prior to class:

1. Students undertake readings, research and/or a development activity.
2. Students answer and provide their reasoning for a series of online multiple-choice questions facilitated via SPARKPLUS (Figure 4).
3. a) Students can log on and use the SPARKPLUS summary screens, showing histograms and confidential comments submitted for each answer choice (A, B, C etc.), to compare their answers (Figure 5) and reasoning to their peers' (similar to Figure 3). Note: having observed that many students see the instructor's answers and explanations as providing closure, being all they need to know, answers are not published until after class to encourage students to make comparisons with their peers and think for themselves.
   b) Academics also use these screens to identify topics that are troubling students and/or any regular misconceptions.

Figure 4: Example question screen in SPARKPLUS.

Figure 5: Individual student's result screen. Note the yellow histogram indicates this student's answer.

During Class

Classes begin with a general discussion of the pre-work activity, exploring the associated material in more detail. During this time the instructor, guided by the results (typically displayed in class as shown in Figures 5 and 3) and in collaboration with the students, addresses any common misconceptions or misunderstandings, after which more complex material is explored.
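The per-choice histograms and grouped reasoning comments described in step 3 are SPARKPLUS features whose implementation the paper does not detail; as a rough sketch only (function name and data are hypothetical), the aggregation behind such a summary screen might look like:

```python
from collections import Counter

def answer_histogram(responses):
    """Tally anonymous responses to one multiple-choice question.

    responses: list of (choice, reasoning) pairs submitted before class.
    Returns counts per choice plus the reasoning comments grouped by
    choice, so students can compare their thinking with their peers'
    and instructors can spot common misconceptions.
    """
    counts = Counter(choice for choice, _ in responses)
    comments = {}
    for choice, reasoning in responses:
        comments.setdefault(choice, []).append(reasoning)
    return counts, comments

# Hypothetical pre-class responses to one question
responses = [("A", "units cancel"), ("C", "used eq. 2"),
             ("A", "dimensional analysis"), ("B", "guessed")]
counts, comments = answer_histogram(responses)
print(dict(counts))  # prints {'A': 2, 'C': 1, 'B': 1}
```

Grouping the free-text reasoning by choice, rather than just counting votes, is what lets an instructor see students arriving at the right answer for the wrong reason, a benefit the paper reports below.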
This activity is often repeated in class for the more complex material, again first individually then collaboratively, using either SPARKPLUS or IFAT cards 24.

Impact/Discussion

These pre-lecture activities have been used as the initial step in the collaborative learning framework (Figure 1) in a range of classes, both locally and in our offshore teaching program in Hong Kong. Students reported that it gave them an opportunity to check their understanding and learn from comparing their answers and reasoning to other students' in the safety provided by the anonymised reporting in the software. Other students reported using the activities before class as a guide to what they were expected to learn and after class to evaluate their understanding. In attempting questions, students regularly described a tendency to answer a question without fully and/or carefully reading it, usually leading to the choice of an incorrect answer. This proved to have a positive outcome, with students commenting they became more careful in reading questions, particularly in summative activities.

Instructors reported that the summary screens (Figure 5) made it easier for them to identify areas of the subject that students were having trouble understanding. In addition, being able to click on the slider and view the students' comments explaining their reasoning for the different answer choices gave them insights into common misconceptions, especially in cases when students were getting the right answer for the wrong reason. Academics also found this feature useful to display the range of answers and explanations in class and hence facilitate learning-oriented discussion. The formative nature of these activities allows instructors to provide innovative variations and students the freedom to focus on learning rather than maximising marks.
For example, we found vigorous debate and enhanced discussion often resulted from providing questions with two correct answer choices expressed in different ways, or with no correct answer choice provided where students were expecting a single correct answer.

A number of students reported that they found "…it too difficult to answer the questions before the content was taught in lectures...". This response was most common amongst students undertaking their first year of study, and is not surprising given that there may have been few opportunities during their school years to practise self-directed learning. However, with a generally agreed aim in higher education to develop independent learners, we argue that the best time to start developing these skills is first year. We do, however, recognise the difficulties that such 'flipped' classroom activities can present to students, and in future semesters we intend to support these activities with short introductory videos on each topic. Somewhat surprisingly, given the high use of technology by most students, a small minority reported a dislike for online learning activities: "I like to work from books and past tests or exams... I don't really like using computers", and "...I prefer more traditional methods", reminding us of the need to provide inclusive alternatives.

Future Directions

We are currently undertaking a trial using the student activity described above to provide learning opportunities within a MOOC. Additional measures have been undertaken to improve the implementation of this activity in a MOOC environment, including enabling real-time online collaboration. We also intend to pay more attention to the analytics collected by the software to see if they suggest any issues to be investigated related to student learning; for example, exploring any relationship between learning and how often students log on and/or the amount of interaction when comparing their answers.

More recently the authors are investigating how activity scaffolding can be augmented by identity theory. A relatively under-researched area of engineering education is the impact of university-based learning experiences on the development of the personal and professional identities of our students. Some research, such as that by Tonso 25, McNair et al. 26, Eliot and Turns 27 and Bennet 28, suggests that students' personal identity impacts the way they engage with learning opportunities and that their identity as an emerging engineer can be reinforced by designing activities that encourage students to practise the thinking and language of the discipline.

Conclusions

Activities designed using the collaborative learning framework assisted instructors to co-construct standards, improving their judgement, grading, articulation and quality of feedback. Learning benefits were also apparent in using the framework to design activities for students with iterations of individual and collaborative work. The educational tool SPARKPLUS proved to be an efficient and effective tool to facilitate these activities, particularly in large classes.

Bibliography

1. Hager, P. & Hodkinson, P. (2009) Moving beyond the metaphor of transfer of learning. British Educational Research Journal, 35 (4), 619–638.
2. Fenwick, T. (2009) Making to measure? Reconsidering assessment in professional continuing education. Studies in Continuing Education, 31, 229–244.
3. Billett, S. (2001) Learning through work: Workplace affordances and individual engagement. Journal of Workplace Learning, 13 (5), 209–214.
4. Billett, S. (2004) Workplace participatory practices: Conceptualising workplaces as learning environments. The Journal of Workplace Learning, 16 (6), 312–324.
5. Skule, S. & Reichborn, A. (2002) Learning-conducive work: A survey of learning conditions in Norwegian workplaces. Luxembourg: CEDEFOP.
6. Smith, B. L. & MacGregor, J. T. (1992) What is collaborative learning? In Goodsell, A. S., Maher, M. R. & Tinto, V. (Eds.), Collaborative learning: A sourcebook for higher education. University Park, PA: National Center on Postsecondary Teaching, Learning, & Assessment.
7. Dana, S. (2007) Implementing Team-Based Learning in an Introduction to Law Course. Journal of Legal Studies Education, 24, 59–108.
8. Willey, K. & Gardner, A. (2008) Using Self Assessment to Integrate Graduate Attribute Development with Discipline Content Delivery. Proceedings of the 36th Annual Conference of the European Association for Engineering Education (SEFI), Quality Assessment, Employability and Innovation, 2–5 July, Aalborg, Denmark.
9. Willey, K. & Gardner, A. (2009) Investigating the capacity of Self and Peer Assessment to Engage Students and Increase their Desire to Learn. Proceedings of the 37th Annual Conference of the European Association for Engineering Education (SEFI), Attracting Students in Engineering – Engineering is Fun, 1–4 July, Rotterdam, the Netherlands.
10. Gardner, A. & Willey, K. (2010) Critical conversations: how collaborative learning activities can prepare students for Structural Engineering practice. Proceedings of the 21st Annual Conference of the Australasian Association for Engineering Education, 5–8 December, Sydney, Australia.
11. Willey, K. & Gardner, A. (2010) Investigating the potential of self and peer assessment in developing learning oriented assessment tasks. Proceedings of the 20th Annual Conference of the Australasian Association for Engineering Education, Adelaide, Australia.
12. Willey, K. & Gardner, A. (2010) Collaborative Peer Learning to Change Learning Culture and Develop the Skills for Lifelong Professional Practice. Proceedings of the 21st Annual Conference of the Australasian Association for Engineering Education, Sydney, Australia, pp. 222–229.
13. Svinicki, M. (2004) Learning and Motivation in the Postsecondary Classroom. Anker Publishing.
14. Irons, A. (2008) Enhancing Learning through Formative Assessment and Feedback. Routledge.
15. Willey, K. (2008) SPARKPLUS.
16. Runesson, U. (1999) Teaching as constituting a space of variation. In Mercer, N. (Ed.), 8th European Association for Research on Learning and Instruction (EARLI) Conference, Göteborg, Sweden.
17. Willey, K. & Gardner, A. (2011) Change Learning Culture with Collaboration. Proceedings of the 2011 SEFI Annual Conference: Global Engineering Recognition, Sustainability, Mobility, Lisbon, Portugal, pp. 93–98.
18. Willey, K. & Gardner, A. (2011) Want to change learning culture: provide the opportunity. Proceedings of the Research in Engineering Education Symposium, Madrid, Spain, pp. 259–267.
19. Gardner, A. & Willey, K. (2011) Investigating the characteristics of successful collaborative learning activities. Proceedings of the Research in Engineering Education Symposium, Madrid, Spain, pp. 332–339.
20. Willey, K. & Gardner, A. (2012) Collaborative Learning Frameworks to Promote a Positive Learning Culture. Proceedings of the 42nd ASEE/IEEE Frontiers in Education Conference, Seattle, Washington, pp. 638–643.
21. Willey, K. & Gardner, A. (2010a) Perceived Differences in Tutor Grading in Large Classes: Fact or Fiction? Proceedings of the 40th ASEE/IEEE Frontiers in Education Conference, Virginia, USA, 27–30 October.
22. Willey, K. & Gardner, A. (2010b) Improving the standard and consistency of multi-tutor grading in large classes. Assessment: Sustainability, Diversity and Innovation, Proceedings of the ATN Assessment Conference, University of Technology, Sydney, 18–19 November.
23. Willey, K. & Gardner, A. (2011) Getting tutors on the same page. Proceedings of the 2011 Australasian Association of Engineering Education (AAEE) Conference, Fremantle, Western Australia.
24. Epstein Educational Enterprises, last viewed 26th May 2012.
25. Tonso, K. (2006) Student engineers and engineer identity: campus engineer identities as figured world. Cultural Studies of Science Education, 1, 273–307.
26. McNair, L., Newswander, C., Boden, D. & Borrego, M. (2011) Student and faculty interdisciplinary identities in self-managed teams. Journal of Engineering Education, 100 (2), 374–396.
27. Eliot, M. & Turns, J. (2011) Constructing professional portfolios: Sense-making and professional identity development for engineering undergraduates. Journal of Engineering Education, 100 (4), 630–654.
28. Freer, P. & Bennet, D. (2012) Developing musical and educational identities in university music students. Music Education Research, 14 (3), 265–284.

Willey, K., & Gardner, A. P. (2013, June), Invited Paper - SPARKPLUS : Enabling collaboration and dialogue for learning and developing standards Paper presented at 2013 ASEE International Forum, Atlanta, Georgia. 10.18260/1-2--17253

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2013 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015