.) Developing Models in Science Education (Dordrecht: Kluwer). 3–18.
2. Koretsky, M.D., D. Amatore, C. Barnes, and S. Kimura, "Enhancement of student learning in experimental design using a virtual laboratory," IEEE Transactions on Education 51, 76 (2008).
3. Kelly, C., E. Gummer, P. Harding, and M.D. Koretsky, "Teaching Experimental Design using Virtual Laboratories: Development, Implementation and Assessment of the Virtual Bioreactor Laboratory," Proceedings of the 2008 American Society for Engineering Education Annual Conference & Exposition (2008).
4. Koretsky, M.D., C. Kelly, P. Harding, and E. Gummer, "Comparison of Student Perceptions of Virtual and Physical Laboratories," Proceedings of the 2009 American Society for Engineering
AC 2010-1970: REFINEMENT AND INITIAL TESTING OF AN ENGINEERING STUDENT PRESENTATION SCORING SYSTEM
Tristan Utschig, Georgia Institute of Technology
Dr. Tristan T. Utschig is a Senior Academic Professional in the Center for the Enhancement of Teaching and Learning and is Assistant Director for the Scholarship and Assessment of Teaching and Learning at the Georgia Institute of Technology. Formerly, he was Associate Professor of Engineering Physics at Lewis-Clark State College. Dr. Utschig has regularly published and presented work on a variety of topics including assessment instruments and methodologies, using technology in the classroom, faculty development in instructional design, teaching
unintended consequence of built-in obsolescence. The ineffectiveness of many designs has been rooted in a static view of learning and teaching styles, personnel-dependence, an inability to manage changes in program size, and/or a lack of portability and adoption by the larger educational community. To avoid these specific pitfalls in our design for educational enhancement, we are: (1) employing a dynamic view of learning and teaching styles, where the characteristics of students and faculty are periodically measured to establish an assessment process calibration, (2) using knowledge management systems to process voluminous data collection and analysis in an efficient and flexible manner, (3) using a modular design of an established assessment paradigm that
Professional Development
Eleven (11) experienced and six (6) inexperienced GTAs were employed in Fall 2007. Experienced GTAs had been assigned a first-year engineering laboratory section and graded nearly all students' work, including students' work on MEAs, in at least one prior semester. Inexperienced GTAs had no prior experience with the first-year engineering course. All GTAs received four hours of professional development (PD) training prior to the start of the Fall 2007 semester. The PD focused on several aspects: connecting engineering practice to teaching, the MEA pedagogy, audience information (first-year engineering students), and practical issues of MEA implementation and assessment14. GTAs were trained to understand the open-ended and realistic
-solving competencies has been developed. First, an engineering conceptual and procedural taxonomy will be presented. The taxonomy is organized into seven taxa and three cognitive levels. Further, an exercise of conceptual and problem-solving analysis will be performed on a spring-pulley problem. Using this analysis, a model of a CPI was developed. An assessment instrument was then constructed to aid in the placement of students at their appropriate levels of the taxonomy. A sample laboratory assignment will be presented to show how such hands-on experiences could effectively complement the classroom teaching activity. Finally, preliminary testing results and concluding remarks will be reported.
II. Development of the Conceptual and Procedural Taxonomy
A. The Need for
(Entrepreneur, etc.)                                      5 (50%)    44 (33%)
Government (Politician, Science Policy Advocate, etc.)    3 (30%)    16 (12%)
Industry (Engineer/Research Scientist)                   10 (100%)  114 (84%)
Research Laboratory (Engineer/Research Scientist)         7 (70%)    67 (50%)
Other (please specify)                                    0 (0%)      3 (2%)
*Responses obtained from a survey sent to the 272 GSIs in the College of Engineering in Fall 2009 (~50% response rate)
Since EGSMs are advanced doctoral students (many of whom have reached candidacy), who also have in-depth training and experience related to effective college teaching, consulting
AC 2010-1900: SPECIAL SESSION: MODEL ELICITING ACTIVITIES -- INSTRUCTOR PERSPECTIVES
Ronald Miller, Colorado School of Mines
Ronald L. Miller is professor of chemical engineering and Director of the Center for Engineering Education at the Colorado School of Mines where he has taught chemical engineering and interdisciplinary courses and conducted engineering education research for the past 24 years. Dr. Miller has received three university-wide teaching awards and has held a Jenni teaching fellowship at CSM. He has received grant awards for education research from the National Science Foundation, the U.S. Department of Education FIPSE program, the National Endowment for the Humanities, and the
research interests focus on the application of ePortfolio pedagogy and practices to facilitate teaching, learning, and assessment for students, faculty, and institutions. She is also interested in the exploration of the affordances and scalability of these kinds of social software tools and their implications for the design and evaluation of innovative learning spaces to support formal and informal learning.
Kenneth Goodson, Stanford University
Kenneth E. Goodson is professor and vice chair of mechanical engineering at Stanford University. His research group studies thermal transport phenomena in semiconductor nanostructures, energy conversion devices, and microfluidic heat sinks, with a focus on
AC 2010-1109: CHANGING HIGH SCHOOL STEM TEACHER BELIEFS AND EXPECTATIONS ABOUT ENGINEERING LEARNING AND INSTRUCTION
Mitchell Nathan, University of Wisconsin, Madison
Professor Mitchell Nathan, PhD and BSEE, is currently Chair of the Learning Sciences program at the University of Wisconsin-Madison, and a founding officer of the International Society of the Learning Sciences (ISLS). Dr. Nathan studies the cognitive, embodied, and social processes involved in learning and teaching mathematics, science and engineering in classrooms and the laboratory, using analysis of discourse, survey and assessment instruments, and experimental design. Dr. Nathan examines teacher beliefs about student
experiences and opportunities. In other words, the real challenge in college teaching today is not covering the material for the students, but rather uncovering the material with the students2. There are several strands of pedagogies of engagement under the umbrella of active learning methods that have received attention by engineering educators world-wide2, 3. These methods/approaches are known to increase students’ active engagement in learning and also promote cognitive elaboration, enhance critical thinking, and contribute toward social and emotional development. For many faculty, there remain questions about what “active learning” is and how it differs from traditional engineering education, since the latter involves activities through homework
, and retention as the overall demand for moved from the defense needs of the Cold War era to the explosive rise of global competition (National Research Council Board for Engineering Education, 1995). The need for change was initially recognized in three separate reports targeting engineering education (American Society for Engineering Education [ASEE], 1994; National Science Foundation [NSF], 1995; and National Research Council Board for Engineering Education, 1995). Since those initial studies, other reports have called for more specific changes related to teaching and curriculum to support a more diverse group of learners
AC 2010-1107: HOW STUDENT-FACULTY INTERACTIONS INFLUENCE STUDENT MOTIVATION: A LONGITUDINAL STUDY USING SELF-DETERMINATION THEORY
Katherine Winters, Virginia Tech
Katherine Winters is a doctoral student and Graduate Teaching Fellow in the Department of Engineering Education at Virginia Tech. She has a M.S. in Civil Engineering and a B.S. in Civil and Environmental Engineering from Brigham Young University. Her research interests include engineering student motivation and identity.
Holly Matusovich, Virginia Tech
Holly Matusovich is an Assistant Professor in the Department of Engineering Education. Dr. Matusovich recently joined Virginia Tech after completing her doctoral degree in Engineering
settings committed to environmental protection. She teaches undergraduate and graduate courses including Aquatic Chemistry and Environmental Engineering Laboratory, and developed an interdisciplinary, project-based two-course sequence, Sustainability Concepts: Mercury in Tampa Bay and Mercury in Guyana. She is the faculty advisor for USF's Chapter of Engineers for a Sustainable World and is an affiliate of the USF Office of Sustainability.
Ken Thomas, University of South Florida
Ken D. Thomas is currently a PhD candidate and teaching assistant in USF’s Department of Civil & Environmental Engineering. Ken obtained a BSc in Chemical and Process Engineering as well as an MSc in Environmental Engineering from UWI
) what motivates students to study engineering; and (3) how students conceive of their engineering future. While the findings from the APPLES research have been disseminated through traditional venues such as conferences and journal publications, an innovative institution-specific workshop model was designed and piloted in spring 2009. This paper describes this new format for disseminating national research findings, which is specifically aimed at engaging faculty in conversations that directly lead to changes in local educational practices and policies. Feedback from the faculty participants and the impact of the workshop on teaching and learning practices in subsequent months are presented. The broader implications of a national-local workshop model for the
problems. And they may be sufficient for earning a passing grade in the course. However, when large numbers of students flounder on open-ended problems that require deeper understanding of the material, it becomes clear that the educational process is not working. Cognition research2,13,15 has addressed situations such as these in which students are faced with tasks that do not have apparent meaning or logic. For students to “learn with understanding,” they need to “take time to explore underlying concepts and to generate connections to other [knowledge] they possess.”2 For several years, our teaching strategy has focused on giving students first-hand experiences with electric motors and balancing devices in the laboratory. We had students generate
ECE department offers the EI&S course, a 3-credit course for non-EE majors. The course has a large intake, with approximately 100 students from mechanical, bio-system, material, applied, and civil engineering majors. The course is delivered in a traditional manner through lectures, labs, and a published e-book made available to the students via the university web. The course is managed through Angel, the university’s course e-management system, only to the extent of posting assignments, solutions, and individual grades. The course does not have a permanently assigned instructor. Like most service courses, teaching responsibility is rotated among the departmental faculty on a 2-3 year cycle. The course introduces the breadth of EE while providing hands-on experience in
three full-time faculty as well as two long-time part-time faculty who had been teaching courses related to design, including the 286A/B sequence. The committee’s charge from the Department Chair was to create a new design stem of courses to support our program’s learning outcomes, without being constrained by the format of the existing course sequence. The committee was to use the Conceive-Design-Implement-Operate (CDIO) framework as a template for this review. CSUN has been a CDIO collaborator since 2005, and has adapted the CDIO syllabus2 to the needs of our student population, which is characterized by significant racial and ethnic diversity, as well as large variances in academic preparation3. The application of CDIO principles to
Exposition, pp. 2599-2606, 2001.
18. Miller, R., and Olds, B., “Encouraging Critical Thinking in an Interactive Chemical Engineering Laboratory Environment,” Proceedings of the Frontiers in Education Conference, pp. 506-510, 1994.
19. Bruno, B., and Anderson, A., “Using Objective Driven Heat Transfer Lab Experiences to Simultaneously Teach Critical Thinking Skills and Technical Content,” Innovations in Engineering Education, pp. 177-189, 2005.
20. Nelson, S., “Impact of Technology on Individuals and Society: A Critical Thinking and Lifelong Learning Class for Engineering Students,” Proceedings of the Frontiers in Education Conference, 3:S1B/14-S1B/18, 2001.
21. Wiggins, G., “Educative Assessment: Designing
AC 2010-823: USING THE EMERGENT METHODOLOGY OF DOMAIN ANALYSIS TO ANSWER COMPLEX RESEARCH QUESTIONS
Lindsey Nelson, Purdue University
Lindsey Nelson is a graduate student in Engineering Education. She graduated from Boston University with her bachelor's degree in Mechanical Engineering. In trying to gain knowledge about teaching and learning within an engineering context, Lindsey pursued some graduate study in mechanical engineering and shifted to teaching high school physics. As an active member of the American Association of Physics Teachers, she developed an interest in curricular innovations. Combining her interest in curricular innovations with a passion for social justice, Lindsey
AC 2010-1639: USING GRAPHIC NOVELS TO COMMUNICATE ENGINEERING EXPERIENCES IN AN URBAN MIDDLE SCHOOL
Jennifer Atchison, Drexel University
Jennifer Atchison is a PhD candidate in the Department of Materials Science and Engineering at Drexel University and her area of research is focused on nanophotonics. She is a second-year NSF Graduate Teaching Fellow in K-12 Education and is the Science Program Director for The Achievement Project.
Dorothea Holmes-Stanley, St. Cyprian's School
Dorothea Holmes-Stanley is the science teacher for 5th-8th grades at St. Cyprian's School in Philadelphia.
Adam Fontecchio, Drexel University
Dr. Adam Fontecchio is an Associate Professor and Assistant Department
AC 2010-1808: STEPWISE METHOD FOR DEAF AND HARD-OF-HEARING STEM STUDENTS IN SOLVING WORD PROBLEMS
Gary Behm, Rochester Institute of Technology
Gary Behm is a Senior Project Associate and Director of the NTID Center on Access Technology Innovation Laboratory and a Visiting Lecturer at NTID. He is a deaf engineer at IBM who received his BS from RIT and his MS from Lehigh University. He currently serves as a loaned executive at NTID/RIT working in the Center on Access Technology and the department of Engineering Studies. At IBM, he is a delivery project manager in the Rapid Application Development Engineering System. Behm has six patents and has presented over 20 scientific and technical papers
received her Ph.D. from the School of Engineering Education at Purdue University.
Brian Self, California Polytechnic State University
Brian Self is a Professor in the Mechanical Engineering Department at California Polytechnic State University in San Luis Obispo. Prior to joining the faculty at Cal Poly in 2006, he taught for seven years at the United States Air Force Academy and worked for four years in the Air Force Research Laboratories. Research interests include active learning and engineering education, spatial disorientation, rehabilitation engineering, sports biomechanics, and aerospace physiology. He worked on a team that developed the Dynamics Concept Inventory and is currently
Graduate School of Education and Psychology, with a joint appointment in the Department of Mathematics. Formerly Director of the Center for Research on Learning and Teaching at the US Air Force and a Division and Program Director at the National Science Foundation. Prior to coming to NSF, he directed an NSF-funded center in Chicago to promote the participation of underrepresented minorities in science, engineering and mathematical professions. His current work is supported by the Institute for Education’s Educational Technology program and NSF’s Course, Curriculum and Laboratory Improvement (CCLI) program; it focuses on collaborative learning technologies and interfaces, immersive learning
reveals that there is considerable debate about what “counts” as interdisciplinary teaching and research. Decisions about which theories and definitions to adopt have implications for how scholars define interdisciplinarity, what educators believe constitutes interdisciplinary education, and for what researchers choose to include and exclude in studies of the development of students’ interdisciplinary competence. In this paper we present data excerpts from six detailed case studies that reveal the many, varied, and often conflicting definitions of interdisciplinarity used by engineering administrators and faculty members in discussions of undergraduate educational activities intended to develop students’ interdisciplinary competence. These definitions
AC 2010-2137: OPEN-BOOK VS. CLOSED-BOOK TESTING: AN EXPERIMENTAL COMPARISON
Leticia Anaya, University of North Texas
Leticia Anaya, M.S. is a Lecturer in the Department of Engineering Technology at the University of North Texas College of Engineering. She is currently working on her PhD in Management Science at the University of North Texas. She received her M.S. in Industrial Engineering from Texas A&M University. Her research and teaching interests include Thermal Sciences, Statistics, Quality Assurance, Machine Design, Simulation and Educational Teaching Methods. She has published previously at ASEE Conferences and has developed three laboratory manuals in the following areas
is presented in the following list. Items referenced with [29] are quoted from the THE Thomson Reuters Survey and those with [31] from the ARWU.
1. Financial indicators
   a. Income from research grants and awards (may be intramural or external) [29]
   b. Total expenditures [29]
   c. Income from teaching [29]
   d. Analysis of income sources (government, private, competitive, industry) [29]
   e. Analysis of expenditures (staff salaries, teaching, research, library, real estate) [29]
   f. The size of the resource supporting the program
      i) Size of the endowment
      ii) Number and state of equipment of the laboratories and facilities
GSE for measuring modeling self-efficacy. In building our self-efficacy scale, we followed two essentials: first, we investigated other relevant scales in fields that are close to engineering modeling and academic settings, and second, we observed the guidelines suggested by Bandura. Pajares [28] provides a comprehensive list of the relevant efficacy scales for academic settings. We used his list of scales and added other available scales to create a comparison list of scales. This list is provided in Table 3.
Table 3. Major Self-efficacy Scales for Various Academic Tasks
Source               Sample Question or Direction    Answer Options
Teaching Efficacy    How much can you …?             [Completed by various