Conference Location: New Orleans, Louisiana
Publication Date: June 26, 2016
Conference Dates: June 26-29, 2016
ISBN: 978-0-692-68565-5
ISSN: 2153-5965
Division: Electrical and Computer
Page Count: 16
DOI: 10.18260/p.25667
Permanent URL: https://peer.asee.org/25667
Download Count: 461
Dr. Sohoni is an Assistant Professor in Engineering and Computing Systems at Arizona State University’s College of Technology and Innovation. Prior to joining ASU, he was an Assistant Professor at Oklahoma State University. His research interests are broadly in the areas of computer architecture and performance analysis, and in engineering and computing education. He has published in ACM SIGMETRICS, IEEE Transactions on Computers, the International Journal of Engineering Education, and Advances in Engineering Education. His research has been supported by internal and external funding sources, including the National Science Foundation.
He is a popular and well-respected instructor and has received many teaching awards, including the Regents Distinguished Teaching Award at OSU in 2010.
Scotty D. Craig is an Assistant Professor in the Human Systems Engineering Program within the Fulton Schools of Engineering at Arizona State University. Dr. Craig received his Ph.D. in Experimental Psychology, with a focus in Cognitive Psychology, from the Department of Psychology at the University of Memphis, and completed a Post-Doctoral Fellowship at the Pittsburgh Science of Learning Center.
His goal is to conduct cutting-edge research at the intersection of human cognition, technology, and the learning sciences that provides solutions to real-world problems in education and training. His current research focuses on improving learning through higher-level cognitive factors, such as discourse and cognitive-affective states, using virtual humans within technological environments. More information on Dr. Craig’s work can be found at www.cobaltlab.org.
There is a large body of literature on effective teaching and learning, both within ASEE’s conference proceedings and journals and in wider outlets. One primary research goal is to get engineering educators to adopt effective pedagogies in their classrooms. However, this is not happening at the rate or scale the engineering education community hopes for. Furthermore, even where it is happening, there is little evidence indicating whether the implementation is faithful (e.g., whether techniques are implemented as intended or have the desired impact in the classroom). Our hypothesis is that, even for faculty who are interested in adopting innovative teaching methods, literature outside their specific sub-discipline is difficult for them to implement and evaluate because of deficiencies in conceptual understanding of the larger context of the research literature and its methodologies. They might understand or appreciate some of the theory or the general learning principles from these publications, but how the principles can be applied within their classrooms is often unclear to them. More importantly, they do not know how to assess the impact of the changes. Setting up research studies involving human subjects, designing within-classroom evaluations, or simply designing the right questions to ask on a pretest and posttest are activities most practitioners are not trained in. Many practitioners may also feel they do not have the time to implement these principles and evaluations. Thus, a gap exists even between most of the engineering education literature and what actually translates into classrooms. We believe that specifically focused, discipline-based, or even course-level guiding papers are necessary to give educators the tools and the confidence to employ effective teaching techniques and to evaluate their impact. This work, a collaboration between a computer architect who has expanded his research into engineering education and a cognitive psychologist who specializes in the learning sciences and technology, aims to provide an example of such a ‘guiding paper’. As an illustration of the kind of specific information and tools necessary for broader adoption, we present details of an experimental design, the pre- and post-test questions, and a discussion of the choices we had and the decisions we made. In this example scenario, we propose to investigate the impact of an intervention in a computer organization course. By analyzing a previous experimental setup, we illustrate specific lessons learned that could facilitate implementation and evaluation.
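To make the pre/post evaluation idea concrete, here is a minimal, illustrative sketch (not drawn from the paper itself) of how an instructor might summarize pretest and posttest scores from a classroom intervention in Python. The scores, the 100-point scale, and all variable names are hypothetical; it assumes SciPy is available for the paired t-test.

```python
# Minimal sketch of a pre/post analysis for a hypothetical classroom intervention.
# Scores are percent correct for the same students, in the same order, on a
# pretest and posttest; all values here are made up for illustration only.
from scipy import stats

pre  = [40, 55, 30, 65, 50, 45, 60, 35]
post = [60, 70, 55, 80, 65, 70, 75, 50]

# Hake-style normalized gain: fraction of the possible improvement achieved
# by each student, averaged across the class.
gains = [(b - a) / (100 - a) for a, b in zip(pre, post)]
avg_gain = sum(gains) / len(gains)

# Paired t-test on matched pre/post scores: did performance change reliably?
t_stat, p_value = stats.ttest_rel(post, pre)

print(f"Average normalized gain: {avg_gain:.2f}")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```

A summary like this is only one of several reasonable choices; the design questions the paper raises (what to ask on the tests, how to control for prior knowledge, whether human-subjects approval is needed) still have to be settled before any such analysis is meaningful.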
Sohoni, S. A., & Craig, S. D. (2016, June), Making the Case for Adopting and Evaluating Innovative Pedagogical Techniques in Engineering Classrooms. Paper presented at 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. 10.18260/p.25667
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2016 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015