also like to acknowledge contributions from colleagues in the Engineering Learning and Practice Group and Dr. Lesley Jolly of the University of Queensland for invaluable help with survey design and methodology.

References
1. J. P. Trevelyan and S. Tilli, Published Research on Engineering Work. Journal of Professional Issues in Engineering Education and Practice, 2007. Vol. 133, No. 4, pp. 300-307.
2. J. P. Trevelyan, Technical Coordination in Engineering Practice. Journal of Engineering Education, 2007. Vol. 96, No. 3, pp. 191-204.
3. J. P. Trevelyan. A Framework for Understanding Engineering Practice. In American Society for Engineering Education (ASEE) Annual Conference. 2008. Pittsburgh. (Submitted for review.)
4. R
standard deviation).
• One or more statistical measures (e.g., maximum, range, standard deviation) of height (surface elevation) are used to quantify the roughness of the image. The measure(s) selected are aligned with a clearly stated definition of roughness (see the illustrative sketch after this list).
• Frequency, 2-D size, and/or distances between significant features in the images are addressed. Procedures that address these issues must also use a height-related measure to quantify roughness, because measures of frequency, 2-D size, and distance between features alone cannot define roughness. Either the procedure accounts for these issues or a rationale is provided for not considering them within the procedure.
• The fact
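To make the height-based criterion concrete, the sketch below shows how a procedure might report standard-deviation (RMS), mean-absolute, and peak-to-valley measures of surface elevation for a height-map image. This is a minimal sketch, assuming the surface heights are available as a NumPy array; the function name roughness_metrics and the particular metrics shown are illustrative assumptions, not prescribed by the assignment or rubric.

# Minimal illustrative sketch (not part of the rubric): height-based roughness
# measures for a 2-D surface-elevation image stored in a NumPy array.
import numpy as np

def roughness_metrics(z):
    """Return simple statistical roughness measures for a height map `z`."""
    z = np.asarray(z, dtype=float)
    dev = z - z.mean()  # elevation relative to the mean surface level
    return {
        "rms_roughness": float(np.sqrt(np.mean(dev ** 2))),  # standard-deviation-style measure
        "mean_abs_roughness": float(np.mean(np.abs(dev))),   # average absolute deviation
        "peak_to_valley": float(z.max() - z.min()),          # range of surface elevation
    }

# Example usage on a synthetic 64 x 64 height map
rng = np.random.default_rng(0)
print(roughness_metrics(rng.normal(scale=1.0, size=(64, 64))))

Any of the three returned measures would satisfy the requirement for a height-based quantity; the key point in the rubric is that the chosen measure matches the stated definition of roughness.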
7. Shuman, L. J., Besterfield-Sacre, M., & McGourty, J. (2005, January). The ABET “Professional Skills” - Can They Be Taught? Can They Be Assessed? Journal of Engineering Education, pp. 41-55.
8. Brumm, T. J., Hanneman, L. F., & Mickelson, S. K. (2006). Assessing and Developing Program Outcomes through Workplace Competencies. International Journal of Engineering Education, 22(1), pp. 123-129.
9. Shuman, L. J., Besterfield-Sacre, M., & McGourty, J. (2005, January). The ABET “Professional Skills” - Can They Be Taught? Can They Be Assessed? Journal of Engineering Education, pp. 41-55.
10. Rogers, G. (2006, August). “Direct and Indirect Assessments: What Are They Good For?” Community Matters: A Monthly
revitalize our college through the energizing pedagogy of service-learning. The thesis is that service-learning spread throughout the core curriculum is more effective than one intensive course, which in turn is more effective than none at all; that a mixture of required and elective service-learning (s-l) is more effective than either one alone; and that service-learning will result in less coursework time than traditional programs satisfying ABET 2000 criteria. They define service-learning as a hands-on learning approach in which students achieve academic objectives in a credit-bearing course by meeting real community needs. They have integrated service-learning into many kinds of courses. These include design courses
foundation for future coursework. This view of teaching and learning was investigated to see if some faculty see teaching as transmitting information and students' learning as receiving this information [17], without much focus on how the information really functions. Henderson et al.'s work used a physics problem to focus an investigation into faculty perceptions of teaching and learning problem solving. The problem required an average student to use exploratory decision making as opposed to an algorithmic or "plug and chug" approach. Many faculty were oriented towards the algorithmic approach instead of focusing on problem solving: "Much of the material in prerequisite courses prior to the beginning of core engineering courses is oriented greatly towards
recommend something to you but you have to do everything you want, you have to choose what you want to do.’ He always says this to me. I didn’t like this actually because this is like being responsible...He told me that if he train me like this, this is better for my future...if I do this decision[s] on my own.”

The guidance that Trisha’s advisor provided in research and the program allowed her to self-manage. Trisha’s interview indicated that her advisor offered a guided approach that did not infringe, but rather pushed Trisha to be autonomous throughout her graduate experience. We see clear evidence of this in her description of her advisor’s statements of choice within her research, “you have to choose what you want to do
literature, PDI deviates from current collaborative learning approaches in one notable way: the authority in the classroom shifts from the faculty member(s) to Student Instructors (SIs). These Student Instructors are students who previously completed the course and returned to take on responsibility for the design and delivery of learning experiences in the classroom. Faculty, therefore, assume a coaching role with the SIs and no longer act as the source of knowledge, educational material, and content delivery for the course.

This research paper delves into the impact that this learning experience has on student motivation. Using a survey developed from the MUSIC Model of Academic Motivation Inventory®, the authors asked students to report their
] Freeman, S., S. L. Eddy, M. McDonough, M. K. Smith, N. Okoroafor, H. Jordt, and M. Wenderoth. Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences of the United States of America, 111(23), 8410-8415, 2014.
[2] Prince, M. Does active learning work? A review of the research. Journal of Engineering Education, 93:223-231, 2004.
[3] Knight, J. K., and Wood, W. B. Teaching more by lecturing less. Cell Biology Education, 4(4), 298-310, 2005.
[4] Michael, J. Where's the evidence that active learning works? Advances in Physiology Education, 30(4), 159-167, 2006.
[5] McConnell, J. Active learning and its use in computer science. In
grant from the National Science Foundation (Award # EEC-1730576). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. The authors are grateful to Catherine McGough and Rachel Lanning for their assistance in collecting and analyzing survey data.

References
[1] W. Sarasua, N. Kaye, J. Ogle, N. Benaissa, L. Benson, B. Putman and A. Pfirman, “Engaging Civil Engineering Students Through a ‘Capstone-like’ Experience in their Sophomore Year,” Proceedings of the 2020 Annual American Society for Engineering Education (ASEE) Conference and Exposition, Virtual Conference, June 21-24, 2020.
[2] Ogle, J.H., Bolding
].
[5] F. Toney, The Superior Project Organization: Global Competency Standards and Best Practices. New York: Marcel Dekker, 2002, p. 18.
[6] T. Anderson, “Understanding Power Dynamics Will Make You More Persuasive,” Kellogg Insight. [Online]. Available: https://insight.kellogg.northwestern.edu/article/understanding-power-dynamics-will-make-you-more-persuasive. [Accessed Mar. 16, 2020].
[7] F. T. Anbari, E. V. Khilkhanova, M. V. Romanova, M. Ruggia, C. H.-H. Tsay, and S. A. Umpleby, “Cultural Differences in Projects - culturally aware leadership,” in PMI® Research Conference: Defining the Future of Project Management, Washington, DC. Newtown Square, PA: Project Management Institute, July 14, 2010. [Online]. Available: https
approaches to provide non-trivial classification of large data sets. His main teaching interests are crystal plasticity, statistical mechanics, gas dynamics and kinetic theory, numerical methods in engineering, thermodynamics, solid mechanics, and mechanics of materials. He is also interested in developing online courses and using online tools for facilitating active learning techniques in engineering classrooms.

© American Society for Engineering Education, 2020

E-Learning and Assessment in the Cloud: Engineering Courses

S. Papanikolaou1,2
1 Department of Mechanical & Aerospace Engineering, West Virginia University
2 Department of Physics, West
national or regional conference, as well as a poster/presentation at WKU's annual research week. The faculty member is expected to submit publication(s) upon completion of the grant [15].

“The Undergraduate Research Award (URA) program at Miami University encourages students to seek out a faculty-mentored experience in developing a research grant proposal [16].” The grant amount varies from $150 to $500 and is to be spent on research expenses, e.g., purchasing supplies and materials for research. All enrolled full-time undergraduate students in all disciplines on all campuses who have a GPA of at least 2.0 are eligible to apply. The student should approach a faculty member, write the proposal in accordance with the faculty member's guidance, and submit with
), April, 2009, San Diego, CA.
[3]. Creswell, J. W. (2012). Qualitative inquiry and research design: Choosing among five approaches (3rd ed.). Thousand Oaks, CA: Sage Publications.
[4]. Deakin-Crick, R., Broadfoot, P., & Claxton, G. (2004). Developing an effective lifelong learning inventory: The ELLI project. Assessment in Education, 11(3), 247-272.
[5]. Denzin, N. K., & Lincoln, Y. S. (2011). The Sage handbook of qualitative research (4th ed.). Thousand Oaks, CA: Sage Publications.
[6]. Froyd, J., Borrego, M., Cutler, S., Prince, M., & Henderson, C. (2013). Estimates of use of research-based instructional strategies in core electrical or computer engineering courses. Accepted for publication in IEEE
Undergraduate Engineering Education – WIP

This work-in-progress investigates the applicability and relevance of Harvard professor Howard Gardner's theory of multiple intelligences (MIs) to undergraduate engineering education. Gardner developed the theory of multiple intelligences in the early 1980s, initially identifying seven distinct intelligences (also referred to as learning styles in the MI literature): 1) visual-spatial; 2) bodily-kinesthetic; 3) musical; 4) interpersonal; 5) intrapersonal; 6) linguistic; and 7) logical-mathematical. Subsequent researchers have sought to add to this list (for example, “naturalistic”), but only Gardner's original seven MIs will be addressed within this investigation. According to
: an overview. Theory into Practice, Vol. 41, No. 4. College of Education, The Ohio State University.
11. Clark, A.C., & Ernst, J.V. (2010). Engineering and technical graphics education: Using the revised Bloom's taxonomy. Journal for Geometry and Graphics, Vol. 4, No. 2, 217-226.
12. Ferguson, C. (2002). Using the revised taxonomy to plan and deliver team-taught, integrated, thematic units. Theory into Practice, Vol. 41, No. 4. College of Education, The Ohio State University.
13. Huitt, W. (2011). Bloom et al.'s taxonomy of the cognitive domain. Educational Psychology Interactive. Valdosta, GA: Valdosta State University. Retrieved from http://www.edpsycinteractive.org/topics/cognition/bloom.html
14. Writing objectives using Bloom's
).

Emergent Codes | Definition | Example Quote(s)

Expecting too much from others | Expecting other teammates to contribute beyond their “fair share”, especially to avoid responsibility themselves | “Oh, well, you're not working on anything.” I'd be like, “Well, I did my part already, so that's why I'm not working on it.” And then they'd be like, “Well, can you pick up my slack and do what I was supposed to do?”

Failing to advance toward project’s | Passively failing to add value to activities that move | “She just didn’t really contribute as much... I know
-specific sections. One example of a professional networking design paper, offered in an early version of this class as a model, included sections titled “Understanding [project]’s gains,” “Implementation details”, and even “Making it work.”18

Recently, in the broader context of STEM writing, communication scholars have recognized this variation and criticized the uniform approach, first for its tendency to apply the Classical paradigm too liberally to the rhetorics of STEM,19, cited in 17 and second, because the speed with which STEM genres and modes of argument - particularly visual modes of argument - evolve outpaces the existing methods communication scholars use to analyze them.20, cited in 17 Indeed, more recently Swales himself has encouraged methods of move
Engineering Education Annual Conference. Austin, TX. (2009).
2 Blikstein, P. Assessing open-ended scientific computer modeling in engineering education: the role of representations. In Proceedings of the 117th American Society for Engineering Education Annual Conference. Louisville, KY. (2010).
3 Papert, S. Mindstorms: Children, computers, and powerful ideas. (Da Capo Press, 1993).
4 Sherin, B. L. A comparison of programming languages and algebraic notation as expressive languages for physics. International Journal of Computers for Mathematical Learning 6, 1-61 (2001).
5 Jonassen, D., Cho, Y. H. & Wexler, C. Facilitating problem-solving transfer in physics. In Proceedings of the 115th American Society
education: Learning anywhere, anytime. Journal of Engineering Education, 94(1), 131-146.
[2]. Allen, I. E., & Seaman, J. (2011). Going the distance: Online education in the United States, 2011. Sloan Consortium. PO Box 1238, Newburyport, MA 01950.
[3]. Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7-23.
[4]. Ickenberry, S. (2001). Foreword. In Latchem, C., & Hanna, D. (Eds.), Leadership for 21st century learning: Global perspectives from educational perspectives. Sterling, VA: Stylus Publishing.
[5]. Keller, G. (2008). Higher education and the new society
”

Confrontation | Participants “are brought to realize possible inadequacies in their existing conceptions and/or teaching practices and thus create an awareness for the need to change”
Exposure | Workshop facilitator “provide[s] a direction and a model for improvement”
Commitment building | Workshop facilitator “encourage[s] teachers to engage in changes and development”

In the following sections, we describe a professional development program that implements Ho’s concept in order to foster a constructivist framework and student-centered
and Technology (DET) survey. Journal of Engineering Education, 100(4), 800-818.
3. Crede, E., & Borrego, M. (2013). From ethnography to items: A mixed methods approach to developing a survey to examine graduate engineering student retention. Journal of Mixed Methods Research, 7(1), 62-80.
4. Daigneault, P.-M., & Jacob, S. (2014). Unexpected but most welcome: Mixed methods for the validation and revision of the participatory evaluation measurement instrument. Journal of Mixed Methods Research, 8(1), 6-24.
5. Ungar, M., & Liebenberg, L. (2011). Assessing resilience across
. 99, no. 2, pp. 159-168, 2010.
[5] S. P. Brophy, P. Norris, M. Nichols, and E. D. Jansen, “Development and initial experience with a laptop-based student assessment system to enhance classroom instruction,” in American Society for Engineering Education Annual Conference, Nashville, TN, 2003.
[6] S. W. Draper and M. I. Brown, “Increasing interactivity in lectures using an electronic voting system,” J. Comput. Assist. Learn., vol. 20, no. 2, pp. 81-94, 2004.
[7] L. Malmi and A. Korhonen, “Automatic feedback and resubmissions as learning aid,” in IEEE International Conference on Advanced Learning Technologies, 2004. Proceedings, 2004, pp. 186-190.
[8] A. Mitrovic, “An intelligent SQL tutor on the web,” Int. J. Artif. Intell. Educ
published within engineering education scholarly literature. We borrowed and adapted items from a number of existing measures, which included the following (for an item-by-item description, see Appendix A):
• Zhai and Scheer’s (2004) Global Perspective Scale12
• Downey et al.’s (2006) global competency questions13
• Braskamp, Braskamp, & Merrill’s (2008) Global Perspective Inventory, and in particular their Interpersonal Social Responsibility Scale14
• Hilpert, Stump, Husman, and Kim’s (2008) Engineering Attitudes Survey15

Throughout the survey development process, the authors were in dialogue with one another, providing feedback for item clarity, framing, and refinement. Along with evaluating the fit between
campus

Dr. Patrick Cunningham, Rose-Hulman Institute of Technology

Dr. Douglas Karl Faust, Seattle Central College
PhD in Physics; professor of mathematics, physics, astronomy, and computer science.

Dr. Trevor Scott Harding, California Polytechnic State University
Dr. Trevor S. Harding is Professor of Materials Engineering at California Polytechnic State University where he teaches courses in materials design, biomedical materials, and life cycle analysis. He has presented his research on engineering ethics to several universities and to the American Bar Association. He serves as Associate Editor of
acknowledge a limitation of our analysis. We recognize that such group grade-setting meetings are very likely not the norm for courses in the calculus sequence, in other courses that serve as prerequisites for engineering, or in engineering courses. In this sense, we would not expect our findings to generalize to the specific ways in which students are “weeded out” at other institutions. At the same time, we believe that our strategy of analyzing practical dilemmas of grading and sorting, whether this work is carried out individually or in groups, is a potentially productive one in understanding ideological aspects of success and failure.

Bibliography
1. Meyer, M., & Marx, S. (2014). Engineering dropouts: A qualitative examination of why
ranks. The participants worked on collaborative team project(s) to implement teaching innovations at a large, research-intensive, predominantly white institution (PWI) in the Midwest. The project durations ranged from one to three years for sustainable implementation of teaching innovations. The semi-structured interviews covered the participant's previous teaching experience prior to joining the SIIP community, a description of their current role in the community including what did and did not work well, and a description of their vision for the community in the future. Consistent with phenomenological research, the interviews were evaluated holistically to allow essential themes of the experience to emerge.

Preliminary results of the phenomenological
research agenda that can propagate engineering educational innovations across the community and to the other STEM fields. Hence, broader impacts will be fully realized upon actuation of the research agenda. However, this work moves beyond broader impacts in that it assists in meeting a national need to increase U.S. economic competitiveness, the STEM workforce, and potentially partnerships between academia and industry. It is in this latter sense that the project clearly meets the national need to remain economically competitive.

References:
i American Society for Engineering Education (ASEE). (2012). Innovation with impact: Creating a culture for scholarly and systematic innovation in engineering education. Washington, DC: American Society
-ment at George Mason University, USA. She is an educational researcher and pedagogical scholar with signature work in self-study research methodology, including as co-editor of Polyvocal Professional Learning through Self-Study Research (2015), author of Self-Study Teacher Research (2011), and lead editor of Learning Communities In Practice (2008). She is a recipient of the Dissertation Research Award, University of Virginia, and the Outstanding Scholar Award, University of Maryland, and has been a Fulbright Scholar and a Visiting Self-study Scholar. She served as chair of S-STEP from 2013-2015 and is a current Co-PI of two National Science Foundation (NSF) funded grants: Designing Teaching: Scaling up the SIMPLE Design Framework