convenience sampling based on the data analysis available from another study. Of the 5 returning GTAs, 3 were familiar with the Paper Plan Challenge MEA from its Fall 2006 implementation, in which a different rubric was applied [5]. One of these three returning GTAs also had experience with this MEA from the Spring 2008 implementation, in which a rubric similar to the MEA Rubric was used.

An Expert independently assessed Fall 2008 student team MEA solutions. The Expert was a doctoral student in Engineering Education with 7 years of teaching experience in the first-year engineering program and 4 years of experience with research on MEAs, including the development of the MEA Rubric.

C. Data Collection & Analysis

MEAs are conducted via a web-based interface connected to a
conducted in every course in every semester. These are considered a "quick and dirty" monitoring system, but they do allow a fast response where difficulties or weaknesses are identified. The results from these surveys are not used for staff promotion purposes, and evaluative instruments are tailored by the Centre for Academic Development where a staff member wishes to seek feedback of a personal nature or to evaluate a specific initiative in teaching or assessment. The School of Engineering also complies with the University-wide three-year rolling plan of course evaluation. There is a perception that courses are comprehensively evaluated, and in fact there is some talk of "survey overload".

Challenges

Over the period of thirteen years that this program has run