New Orleans, Louisiana
June 26–29, 2016
ISBN: 978-0-692-68565-5
ISSN: 2153-5965
First-Year Programs Division Technical Session 8: Ways to Measure "Things" About Your Course(s)
First-Year Programs
DOI: 10.18260/p.26641
https://peer.asee.org/26641
Tony Lowe is a PhD candidate in Engineering Education at Purdue University. He has a BSEE from Rose-Hulman Institute of Technology and an MSIT from Capella. He currently teaches as an adjunct at CTU Online and has been an on-and-off corporate educator and full-time software engineer for twenty years.
David Evenhouse is a dual-degree Graduate Student and Research Assistant in the Purdue University School of Engineering Education and the School of Mechanical Engineering. He graduated from Calvin College in 2015 with a B.S.E. concentrating in Mechanical Engineering. He has both led and enrolled in study abroad experiences in Spain, taking classes at the Universidad de Oviedo and the Escuela Politécnica de Ingeniería de Gijón. His current research investigates the implementation and effects of select emergent pedagogies related to student and instructor performance and experience in undergraduate education. Other interests include engineering ethics, the philosophy of engineering education, and the intersecting concerns of engineering industry and higher education.
Dhinesh Radhakrishnan is a postdoctoral research associate in the School of Engineering Education at Purdue University.
Stephen R. Hoffmann is the Assistant Head of the School of Engineering Education at Purdue University, with responsibilities for the First-Year Engineering Program.
This evidence-based practice paper describes a process for evaluating a course in the spirit of ABET Criterion 4, continuous improvement. Faculty and staff are often asked to collaborate on the design of core engineering classes and to share teaching across many sections. Over time these courses evolve to accommodate new subject matter, pedagogical approaches, and political and personal preferences, among other criteria, as dictated by a dynamic group of stakeholders. Some changes originate from a clear mandate, while others sneak in without a full analysis of the course. Repeated and often subtle changes may have a significant cumulative impact, reshaping the narrative of the faculty's intent as the course goals and methods are updated semester after semester.
This paper describes a process that uses engineering education research methods to understand the nature and motivation of course changes. We define a six-step process, centered on the artifact analysis methodology, that provides instructional teams with data to better understand the construction of their course and how it has changed over time. A case study examining the design of a large-format first-year engineering course illustrates the process in action, presenting the methodological choices, analysis, and findings as a guide for practitioners seeking to follow our process for gathering data. These data can inform future changes to the course design, ensuring alignment of course objectives, assessment, and pedagogy while systematically meeting ABET Criterion 4.
Lowe, T. A., & Evenhouse, D. A., & Radhakrishnan, D. B., & Hoffmann, S. R. (2016, June), Data-Driven Course Improvements: Using Artifact Analysis to Conquer ABET Criterion 4 Paper presented at 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. 10.18260/p.26641
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2016 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015