Invited Paper - Improving First-year Engineering Education Using the Engineers Without Borders Australia Challenge: what worked for whom under what circumstances

Conference

2013 ASEE International Forum

Location

Atlanta, Georgia

Publication Date

June 22, 2013

Start Date

June 22, 2013

End Date

June 22, 2013

Conference Session

Track 1 - Session I - Student Development

Tagged Topic

Invited - Student Development

Page Count

10

Page Numbers

21.43.1 - 21.43.10

DOI

10.18260/1-2--17248

Permanent URL

https://peer.asee.org/17248

Paper Authors

Lyn Brodie University of Southern Queensland

Lyn Brodie is an Associate Professor in the Faculty of Engineering and Surveying at the University of Southern Queensland. Her research interests include engineering education, Problem Based Learning, assessment and the first year experience. She is a board and founding member of the USQ Teaching Academy and Director of the Faculty Engineering Education Research Group. Lyn was the academic team leader for the teaching team which successfully designed a strand of PBL courses for the faculty. Her work has been recognised through several awards, including a University Award for Design and Delivery of Teaching Materials, a Carrick Institute Citation, an Australian University Teaching Award for Innovation in Curricula Learning and Teaching, USQ Associate Learning and Teaching Fellowships for curriculum and assessment development, and recognition from the Australian Association of Engineering Educators for innovation in curricula. On several occasions Lyn has been a visiting Professor at the University of Hong Kong – Centre for Advancement of University Teaching, consulting in both PBL and online curriculum development and assessment. She is the 2013 president of the Australasian Association for Engineering Education (AAEE).

Lesley Jolly Strategic Partnerships

My original work as an anthropologist was with Australian indigenous peoples, but in 1996 I was approached to undertake an ethnography of the first-year engineering class at the University of Queensland with a view to understanding the gender dynamics there. Since then my association with engineers and engineering has grown to dominate my research life. I have continued to pursue my contact with engineers through a variety of research projects, the supervision of PhD students in engineering problems that have social dimensions, and by establishing and leading the new Research Methods Interest Group of the Australasian Association for Engineering Education (AAEE). In that capacity I have run workshops on research methods and educational evaluation in Australia and New Zealand and was a founding leader of the annual AAEE Winter School for engineering education research. In the last two years I have completed two CRC projects: Evaluation of Simulators in Train Driver Training and Towards a National Framework for Competence Assurance for Train Drivers. I have also recently managed an ALTC project called Curriculum Change through Theory-Driven Evaluation on behalf of the University of Queensland.

Caroline Crosthwaite University of Queensland

Caroline Crosthwaite BE(Hons), MEngSt (UQ), MSc. (JCU). Caroline is Associate Dean (academic), Faculty of Engineering, Architecture & Information Technology at the University of Queensland (UQ), Australia where she oversees teaching and learning, international partnerships and pathways, and academic and student administration in coursework programs.
Caroline is a chemical engineer and has worked at a number of Australian universities, where she has led curriculum design and the development of student-centred, active learning practices. Caroline coordinated UQ’s chemical engineering curriculum team that implemented the internationally recognised Project-Centred Curriculum in Chemical Engineering, which in 2005 won the Australian Award for University Teaching for Enhancing the Quality of Teaching and Learning. Caroline’s engineering education work has also been recognised with an Australasian Association for Engineering Education Award for Excellence in Curriculum Innovation (2003) and the UQ Vice Chancellor’s Award for Internationalisation (2010). Caroline has also been involved in the development of international partnerships, including the first Australian - French and Australian - German double degrees in engineering.
She has just completed a national project looking at the use of Engineers without Borders projects (EWB Design Challenge) in the engineering curriculum in thirteen different Australasian universities. Caroline has also been involved in previous teaching and learning projects developing immersive learning environments using virtual reality, and supporting and assessing students in project teams.
Caroline consults nationally and internationally on engineering education. She has worked with Imperial College London, the Technical University of Denmark, Purdue University, and the Institution of Chemical Engineers (Australia and Malaysia).

Lydia Kavanagh University of Queensland

Associate Professor Lydia Kavanagh is an innovative, enthusiastic and dedicated teacher and mentor who brings to her discipline a wealth of professional engineering experience. Since returning to academia in 1998 after working for 13 years in industry, she has become a leader in engineering education in Australia and has used her background as a professional engineer to design both curricula and courses for active learning by combining real-world projects and specialist knowledge. As UQ's Director of First Year Engineering, Lydia has inspired students to develop the knowledge, confidence and capabilities essential for success in the engineering profession in the 21st Century.
She is dedicated to ensuring that they not only learn about engineering but also how to be engineers. In 2011, she won a national award for Excellence in Teaching, and in 2012, her first year engineering courses received a commendation from Engineers Australia.

Abstract

Improving First-year Engineering Education Using the Engineers Without Borders Australia Challenge: what worked for whom under what circumstances

Introduction

Reviews of engineering and engineering education around the world1,2,3,4 have called for engineers to rise to the challenge of a global environment characterised by rapid social, environmental and technological change. However, despite changes to curriculum intended to pursue such goals, the most recent review of engineering education in Australia5 notes that “further curriculum changes and developments will be essential to maintain student numbers and meet students’ expectations satisfy employers [sic] and the profession at large” (p.59). The report goes on to note that “current engineering curricula [around the world] do not deal well with the difficult topics of uncertainty, integration and complex systems” (p.62) and multidisciplinary approaches (p.72).
Other areas of the curriculum in need of change include active learning, inclusivity, relevance to current engineering practice and better integrated project management instruction5.

One response to such demands has been the adoption of the Engineers Without Borders Challenge (www.ewb.org.au/ewbchallenge) as the basis for first-year team projects in most engineering faculties in the country. Every year, EWB nominates one of its partner organisations in a developing community, with a range of projects and themes addressing needs and work in that community, as the basis for the year’s EWB Design Challenge. EWB develops and provides a suite of resources including on-line information about the community and the partner organisation’s work. As of 2011, over 18,000 students at thirty-one universities in Australia and New Zealand had participated in the EWB Challenge. The nature of the projects provides the opportunity to expose students to the complexity and specificity of real-world projects where sustainability and usability are important factors and teamwork is required to manage the projects. All of the universities involved have implemented this innovation differently, and comparison of these different implementations afforded the opportunity to assemble3 “a body of carefully gathered data that provides evidence of which approaches work for which students in which learning environments” (p.26) in pursuit of the desired outcomes.

The attribution problem

Curriculum innovation and change are more often driven by external factors, such as changes in the field or informal feedback from students and staff, than by systematic data collection3,6. A preliminary study of the EWB Challenge at The University of Queensland7 indicated that at the very least students felt that they had been motivated by the experience, learned how to perform in teams and appreciated the wider role of engineering in society as a result.

However, an unanswered question remained as to whether this or any course innovation is responsible for the outcomes attributed to it. Students are doing much more than just their project work during the semester and there might be many possible causes for any given outcome. In addition, there is commonly a range of responses in the student body to any innovation. This creates a problem for staff who want to build on the perceived benefits of an innovation such as the EWB Challenge, since it is not clear what outcomes can be attributed to what mechanisms and under what circumstances. For instance, the experiential learning attributed to teamwork on real-life projects such as this can be more assumed than proven. In this project we were interested to identify which aspects of context had an influence on the way students went about learning and how those choices affected learning outcomes.

Methodology

Data about how the intervention was being implemented was collected at 13 universities in Australia and New Zealand over Semesters 1 and 2 of 2011. We began data collection with a series of program logic analyses8 with each course controller and, where available, their teaching team. The program logic analysis (Table 1) is a method for uncovering the underlying logic of what is involved in a given intervention and how it is understood to produce the desired outputs, outcomes and impacts. Our method is based on the standard Wisconsin model9,10 and allowed us to gain an understanding of the variety of approaches and intended outcomes across the 13 sites. It also helped identify where there was divergent thinking within a team and places where unintended consequences might be expected. At this stage we also identified useful sources of data and timelines and processes for the data collection.

Table 1: Program Logic Matrix
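
The matrix in Table 1 itself does not survive in this extraction, but the shape of a program logic analysis can be sketched as a simple data structure. The following Python sketch is illustrative only: the field names follow the generic Wisconsin (UW-Extension) logic model9 of inputs, activities, outputs, outcomes and impacts, and the example entries are hypothetical rather than data from the study.

```python
from dataclasses import dataclass, field

@dataclass
class ProgramLogic:
    """A minimal Wisconsin-style program logic matrix.

    Field names follow the generic UW-Extension model; the example
    instance below is hypothetical, not data from the EWB study.
    """
    intervention: str
    inputs: list = field(default_factory=list)      # resources invested
    activities: list = field(default_factory=list)  # what the course does
    outputs: list = field(default_factory=list)     # direct products
    outcomes: list = field(default_factory=list)    # changes in learners
    impacts: list = field(default_factory=list)     # long-term effects

ewb_logic = ProgramLogic(
    intervention="EWB Challenge first-year team project",
    inputs=["teaching team", "EWB design brief and community resources"],
    activities=["team design sessions", "tutor facilitation"],
    outputs=["design report", "oral presentation"],
    outcomes=["teamwork skills", "appreciation of engineering in society"],
    impacts=["graduates prepared for socially complex practice"],
)

# Divergent thinking within a teaching team shows up as disagreement
# about which entries belong at which level (e.g. is a polished report
# an output or an outcome?), which is what the interviews probed.
print(ewb_logic.outcomes)
```
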
Data collection started in Semester 1, 2011 and included document analysis, observation of classes, focus groups and interviews, and a student exit survey. The analytic framework used for dealing with this vast body of data was drawn from the Realist Evaluation work of Pawson and Tilley11. This approach to evaluation is based on the premise that it is not enough to ask whether an intervention works or not. What works in one place may not work equally well under different circumstances, and various factors will account for the different outcomes. In order to elicit the kind of understanding that will allow us to generalise our findings, we must identify what factors in the context make a difference, and the range of possible responses to the intervention. Pawson and Tilley express this as a formula:

C + M = O,

where C stands for “contexts” (understood as the sociocultural conditions that set limits on the efficacy of the intervention), M stands for “mechanisms” (the decisions to change that are triggered by the intervention) and O stands for “outcomes” (which may be unintended as well as intended). Data analysis was qualitative rather than quantitative, seeking to identify the contexts and mechanisms with the most significant impact rather than those that occurred most frequently. Data was managed in the software program NVivo 10 and analysed using the constant comparative method.

Broad level findings

It is impossible in a paper of this length to do justice to all of the findings of a very large project. Instead we will explain briefly the major clusters of context and mechanism factors and pick out one or two to discuss in more detail.

Our analysis was a “grounded” one in that we searched the data for recurrent patterns of contextual influence and mechanism leading to observed outcomes. The analysis of Contexts, for instance, concentrates on how best to understand the factors affecting outcomes rather than working with pre-conceived notions of what may be significant in the context (such as, for instance, “online”).

Figure 1: Examples of cluster- and category-level factors

Cluster C2 (Alignment of Assessment with Learning Goals):
- Enabling categories: the “will it work in the village” context; the “correct assessment target” context.
- Inhibiting categories: the “it’s all about cost” context; the “will it be on the exam” context; the “we don’t know what they’re looking for” context.

Cluster M3 (Desire to Improve Work Practices):
- Supporting categories: the “editing each other’s work” mechanism.
- Disabling categories: the “divide and conquer” mechanism; the “hero leader” mechanism; the “head nodding” mechanism.

We drew on Sochacka’s12 two-level analytic model (Figure 1 is an excerpt from the whole analysis) in which coded data with similar underlying features were grouped together, first into categories which had either positive or negative effects. These categories embodied the range of phenomena present in our data and were labelled with in vivo codes for clarity and immediacy. At a higher level, categories were grouped again into abstract analytic clusters which attempt to capture the general principle at work. At this level we would expect the cluster factors to be generalisable across different situations, and so it is no surprise to find they embody some very general pedagogic principles.
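
The C + M = O logic and the two-level coding scheme can be illustrated with a short sketch. This is a hypothetical rendering of how coded excerpts might be organised, not the authors’ actual NVivo workflow; the excerpt texts are invented, while the category and cluster labels are taken from Figure 1.

```python
from collections import defaultdict

# Each coded excerpt carries an in vivo category, the analytic cluster
# that category was grouped into, and its valence. Excerpt texts here
# are invented; the labels come from the paper's Figure 1.
excerpts = [
    {"text": "We just split the report sections between us.",
     "category": "divide and conquer", "cluster": "M3", "effect": "disabling"},
    {"text": "Would this pump actually survive in the village?",
     "category": "will it work in the village", "cluster": "C2", "effect": "enabling"},
    {"text": "We swapped drafts and fixed each other's sections.",
     "category": "editing each other's work", "cluster": "M3", "effect": "supporting"},
]

# Constant comparison in miniature: group excerpts by cluster and
# category so recurring patterns and their valence become visible.
clusters = defaultdict(lambda: defaultdict(list))
for e in excerpts:
    clusters[e["cluster"]][(e["category"], e["effect"])].append(e["text"])

for cluster, categories in sorted(clusters.items()):
    print(cluster)
    for (category, effect), quotes in categories.items():
        print(f"  [{effect}] '{category}': {len(quotes)} excerpt(s)")
```
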
Contexts

Contexts are the sociocultural conditions that determine what outcomes can be achieved, and the relevant contexts were not only those which appeared frequently in the data but also those where we had evidence of significant impact. We found that there were five relevant contextual factors: stakeholder commitment, alignment of assessment with learning goals, a focus on the conditions of use of the design, teachers embodying course goals in their practice, and the fact of the projects being “real world” (Table 2).

Table 2: Contexts of significance at the cluster level

- C1 Commitment of Stakeholders to Learning Goals: This cluster is concerned with the broadest institutional aspects of implementing the projects, including factors affecting status, purpose and perceptions of the course within its program context.
- C2 Alignment of Assessment with Learning Goals: This cluster includes the degree to which assessment activities and criteria actually address the desired outcomes, and the clarity with which assessment requirements are communicated and understood.
- C3 Focus on Conditions of Use of Design: The degree to which teachers and course designs concentrate on either technical concerns or end-user concerns creates a set of sociocultural conditions that affect what tasks will be pursued and therefore what attributes will be developed.
- C4 Teachers Operationalise Course Aims: This cluster includes a range of observed approaches to the task of leading students through the learning to attain overall objectives. The kinds of mastery demonstrated by teachers were likely to influence how and what students set about learning.
- C5 Use of Real World Projects: Understanding of the projects as work in the real world for real clients has an effect on both student and teacher approaches to the task and how well the objectives are realised.

The context we have chosen to illustrate our analysis of context factors is one of the categories from C2, which could be understood as equivalent to the well-known principle of constructive alignment13. We labelled this the “correct assessment target” context (Figure 1). While many of the staff we interviewed identified non-technical skills such as communication and the need for sustainable design as desirable learning outcomes, in practice assessment most commonly centred on outputs such as written reports and oral presentations. Where the focus was on the outputs, conditions were created that prompted students to adopt the “divide and conquer” mechanism, which we describe in the next section.

There were two approaches to assessment design which provided better learning outcomes: one through the use of portfolio assessment and one through a Demo Day where students had to demonstrate and justify their models/prototypes in public, rather than just talk about them in class. The course using portfolio assessment listed learning objectives that looked very similar to everyone else’s, but instead of expecting targets such as communication to be embedded in a written report, students were required to keep detailed records of their work over the semester and to use those records and their final report to argue for the extent to which they had achieved the outcomes (Figure 2).

Figure 2: Example of portfolio assessment guidelines.

Although no marks were awarded for the project work in this course, students were enthusiastic about this mode of assessment and felt it was relevant to their future professional lives.

The course using the Demo Day as a major part of its assessment allowed students to choose one of four different projects, only one of which was the EWB Challenge. However, all four were constructed to allow for an emphasis on “engineering in practice” and required students to consider how contexts of use and user needs affect design. The students had to build a model/prototype of their design and demonstrate it in a public place on campus, where they could be asked questions by invited industry professionals, staff, senior students and interested passers-by. Class discussions within student teams were more than usually collaborative and inclined to try a range of innovative solutions. However, the outcomes were highly dependent on how the context of the problem was articulated and how well the conditions of the Demo Day could be compared to conditions of actual use15. So the groups designing bridges for emergency deployment were told that their task was to sell their ideas to a client, and the resultant designs were very often incapable of being scaled up for real-world use. In another case, designs for water purification that would have worked in the field were rejected by students because they didn’t have enough time to show effectiveness on Demo Day. Such issues were identified by staff and rectified in subsequent course offerings. Overall, the Demo Day assessment set conditions for learning that helped support the development of outcomes such as communication skills and teamwork that are so hard to see demonstrated in a report.

Mechanisms

The mechanisms are the factors influencing the choices people make in response to the intervention. At the highest level, these choices were found to be influenced by considerations of outcomes that were important to participants, their desire to change their work practices, and awareness of broader engineering practice. For this research, outcomes considerations were separated into sustainability outcomes and all others, since sustainability was identified by course controllers as a desired learning outcome and the literature shows that educators struggle with the idea of educating for sustainability14. Table 3 shows the significant mechanism clusters that were identified.

Table 3: Mechanisms of significance at the cluster level

- M1 Outcomes motivated considerations: This cluster of mechanisms includes the outcomes-focused considerations that changed the balance of choices open to participants. Where teaching staff and students could identify tangible and relevant benefits, outcomes tended to improve.
- M2 Sustainability motivated considerations: This cluster includes factors related to sustainability that changed the choices people made. This cluster indicates that there is some progress yet to be made with respect to this outcome.
- M3 Desire to Improve Work Practices: This cluster includes decisions about how work was to be carried out. The relevant contrast here is between process as part of the learning and production of an output.
- M4 Awareness of Broader Engineering Practices: This cluster relates to decisions and choices made in the light of participants’ understandings of how the projects and associated learning fitted into actual engineering practice.

A mechanism that was very widespread in our data was one we called the “divide and conquer” mechanism. This is the familiar process where students look at the assessment requirements as a report that has to contain certain sections, decide which of them has the best skills for the various tasks involved, and divide the work up accordingly. We are often told that this is standard practice in industry, where the object of group work is to produce a product of some kind, but we question its effectiveness for fostering learning outcomes.

In one university, staff actually facilitated “divide and conquer” by advising students how to divide the work between them. They were then surprised to find that when the teams came to give oral presentations of their designs, members of the team could answer questions on their own sections of the work only and had ignored what other students were doing. The most immediate result of “divide and conquer” is that students miss out on part of the learning of content and skills, but there were implications for the work process as well. Although nearly all of the course controllers said that they wanted their students to learn to work collaboratively, manage teams and communicate, the “divide and conquer” mechanism inhibited such learning. Team meetings were a matter of checking that everyone was making progress on their part of the report rather than an exchange of ideas. Communication was reduced in many cases to memos and minutes, enthusiasm was hard to maintain, and complaints about team members not living up to their responsibilities created problems for academic staff.

The context that was most significant for triggering and supporting the “divide and conquer” mechanism was, in our opinion, a matter of not setting the correct assessment target, that is, being more concerned with the output than the outcome. However, this was not the only contextual factor in play, and we could also point to issues of insufficient institutional resourcing (C1), and instances where the purpose of the design task was lost in a focus on the acquisition of technical skills, as also having an influence. A particular instance from our data is illustrated in Table 4.

Table 4: The logic of an unintended outcome

- Context: Some lack of clarity in expectations (the “don’t know what they’re looking for” context), which tutors try to clarify either by telling students how to approach teamwork through capitalising on existing capacities (“feeding information”) or by specifying a variety of different criteria according to their own preferences and understanding (“idiosyncratic processes”).
- Mechanism: Students use the “divide and conquer” mechanism as their only group process.
- Outcome: When doing their oral presentations, many members of most groups could only answer questions about the small section of the report they had actually worked on. They could not discuss, and therefore probably had not achieved the learning outcomes in, the other aspects of the project.
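
Expressed in the C + M = O form used above, the Table 4 configuration reads as a single realist triple. The snippet below is simply a paraphrase of the published table in that shape; it adds no new data.

```python
# Table 4 as a realist C + M = O configuration (paraphrased, not new data).
unintended = {
    "context": "unclear expectations: 'don't know what they're looking for'",
    "mechanism": "'divide and conquer' as the only group process",
    "outcome": "students could answer questions only on their own sections",
}
print(unintended["context"], "+", unintended["mechanism"],
      "=", unintended["outcome"])
```
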
Moreover, contexts and mechanisms interact with each other, so that even in cases where the assessment was focussed on the report and “divide and conquer” was the chosen mechanism, the outcomes were improved by the influence of other contexts and mechanisms. For instance, amongst the M1 cluster, outcomes-focussed mechanisms, there was a strong trend for students to respond to courses on the basis of their understanding that engineering was a profession that allowed them to make the world a better place. They also wanted to take on responsibility for their work. Those two mechanisms helped to mitigate the worst defects of the “divide and conquer” response.

Implications

We began by noting calls, which have been current for some decades, for engineering education to change in ways that could be described as paying more attention to the conditions in which engineers work (in multidisciplinary projects requiring sophisticated communication) and the impact their decisions can have on the world. Overall we conclude that use of the EWB Challenge projects provides good opportunities for pursuing the desired changes to learning outcomes for engineering students, although other kinds of projects could be equally successful as long as some basic principles are followed. Achieving the best outcomes, regardless of the type of project chosen, is more likely where there is:

- Commitment to, and clear and detailed communication of, the rationale for the intervention and its methods,
- Well-aligned course and assessment design that does not rely on content alone to structure learning outcomes,
- Attention to outcomes rather than outputs, and
- Coherence in teaching approaches across the teaching team, in line with stated objectives.

In other words, the change we need is not the inclusion of new content or a focus on non-technical skills, but rather the embodiment of the kind of engineer society is demanding in ourselves as teachers and in our courses.

Bibliography

1. Institution of Engineers, Australia (1996) Changing the Culture: Engineering Education into the Future. Institution of Engineers; Barton, ACT, Australia.
2. Morgan, R.P., Reid, P.P., and Wulf, W.A. (1998) The changing nature of engineering. ASEE Prism, 5, pp. 12-17.
3. National Academy of Engineering (2005) Educating the Engineer of 2020: adapting engineering education to the new century. Washington, D.C.; National Academies Press.
4. King, J. (2007) Educating Engineers for the 21st Century. Royal Academy of Engineering; London, UK.
5. King, R. (2008) Engineers for the Future: addressing the supply and quality of Australian engineering graduates for the 21st century. ACED; Epping, Sydney.
6. Soundarajan, N. (2004) Program assessment and program improvement: closing the loop. Assessment and Evaluation in Higher Education 29(5): 597-610.
7. Jolly, L., Crosthwaite, C., and Brown, L. (2009) Building on strength, understanding weakness: realistic evaluation and program review. Proceedings of the 20th Annual Conference of the Australasian Association for Engineering Education: 911-917. Adelaide; AaeE.
8. Rogers, P. (2007) Theory-Based Evaluation: Reflections Ten Years On, pp. 63-82 in S. Mathison (ed.) Enduring Issues in Evaluation. New Directions for Evaluation No. 114.
9. University of Wisconsin (2010) University of Wisconsin-Extension, Program Development and Evaluation Model.
10. Markiewicz, A. (2010) Monitoring and Evaluation Core Concepts. Professional development materials used in training workshops for the Australasian Evaluation Society.
11. Pawson, R. and Tilley, N. (1997) Realistic Evaluation. London; Sage.
12. Sochacka, N.W. (2011) Realistic Analysis of Socio-Technical Interventions in the Context of Urban Water Management. PhD dissertation, University of Queensland.
13. Biggs, J. (1996) Enhancing teaching through constructive alignment. Higher Education 32(3): 347-364.
14. Desha, C. (2010) An Investigation into the Strategic Application and Acceleration of Curriculum Renewal in Engineering Education for Sustainable Development. PhD dissertation, Griffith University.
15. Jolly, L., Crosthwaite, C., Brodie, L., Kavanagh, L., and Buys, L. (2011) The impact of curriculum content in fostering inclusive engineering: data from a national evaluation of the use of EWB projects in first year engineering. In AaeE 2011: Developing Engineers for Social Justice: Community Involvement, Ethics & Sustainability, Y.M. Al-Abdeli and E. Lindsay (eds). Engineers Australia; Fremantle, Australia.

Brodie, L., & Jolly, L., & Crosthwaite, C., & Kavanagh, L. (2013, June), Invited Paper - Improving First-year Engineering Education Using the Engineers Without Borders Australia Challenge: what worked for whom under what circumstances Paper presented at 2013 ASEE International Forum, Atlanta, Georgia. 10.18260/1-2--17248

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2013 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015