ability to interact effectively with people from different cultural backgrounds were measured using a standardized survey instrument. Participants reported an increase in their average research competency ratings after completing the program, as indicated by the survey findings. Those improvements cut across demographics such as gender, race/ethnicity, socioeconomic status, and school type.

Acknowledgments: This work was supported by the National Science Foundation's International Research Experiences for Students (IRES) Site grant (Grant Numbers: OISE# 1952490-TAMU, 2208801-NCAT, and 195249-UNLV). Any opinions, findings, conclusions, or recommendations presented are those of the authors and do not necessarily reflect the views of the National Science Foundation.
upon work supported by the National Science Foundation under Grant No. 1848498. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. The authors wish to thank Dr. Elizabeth Litzler, the Project Evaluator, for her valuable input, and Hannah Chiou for her assistance in reviewing codes. Additionally, we thank the students, advisors, and faculty who participated in the study for sharing their experiences.

References
[1] M. T. Cardador, "Promoted up but also out? The unintended consequences of increasing women's representation in managerial roles in engineering," Organization Science, vol. 28, pp. 597-617
pedagogy.

Acknowledgements
This material is based upon work supported by the National Science Foundation under Grant No. 1915614. The opinions, findings, and conclusions or recommendations expressed are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

References
[1] Evaluation Consortium, University at Albany (2016). Experimental Centric Based Engineering Curriculum for HBCUs Leadership Team, HBCU Year Three Report.
[2] Gough, A., & Gough, N. (2018). Beyond Tinkering and Tailoring: Re-de/signing Methodologies in STEM Education. Canadian Journal of Science, Mathematics & Technology Education, 18(3), 284–290.
[3] Astatke, Y., Connor, K. A., Newman, D., Attia, J. O., & Nare, O. E. (2016, June). Growing
student experience and what their experience has been like working on their research projects. The information that is collected is used by the program staff to make any changes in mentor/mentee assignments and to consider what additional programming might be needed for the participants. Participants also take part in a focus group interview with an external evaluator. Questions ask participants to reflect on their experiences during the summer program, how the program has impacted their career and academic goals, and how the experience has developed confidence in different research skills.

Data Analysis
A subset of eleven questions addressing students' overall satisfaction, confidence, and self-efficacy in their research skills was considered for this study
success, we will develop and test interventions that develop these beneficial beliefs and attitudes in students.
• Continue to work closely with our collaborating institutions (Purdue and UTEP) to develop and pilot test initiatives as a means of changing NCA factors for students to improve student success.

Acknowledgement
This material is based upon work supported by the National Science Foundation under grant numbers DUE-1626287 (Purdue University), DUE-1626148 (Cal Poly), and DUE-1626185 (University of Texas – El Paso). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. We
take place online in October or November of 2021. Instructors and students will complete another survey after instructors attend the workshop, and instructors will again complete a follow-up survey in the spring of 2022.

Acknowledgements
This research is supported by the U.S. National Science Foundation (grant numbers DUE-1821092, DUE-1821036, DUE-1821488, and DUE-1821277). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

References
[1] M. Prince, "Does active learning work? A review of the research," Journal of Engineering Education, vol. 93, pp. 223-232, July 2004, doi: 10.1002/j.2168-9830.2004.tb00809.x.
[2
services they need to succeed. As faculty, we need to be advocates and champions for talented students who have been impacted by catastrophic events if we want to retain and graduate them to become successful STEM professionals.

Acknowledgements
This material is based upon work supported by the National Science Foundation under Grants No. 1354156 (Nanotechnology Center); 1833989 (EECOS); 1833869 (PEARLS); 1832468 and 1832427 (RISE-UP). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. The authors are grateful to the advisory board members and evaluators for their valuable input and feedback. We are also greatly
increased by experiences, there were two participants who had experienced a decrease in confidence, or who saw that in others. The decreases in confidence were reported only by direct pathway students. One direct pathway student reported that an internship had impacted "the level of confidence with which I proclaim results," but in a way that reflected less confidence instead of more. He reported having given specific numerical results as "a figure of speech," and, after being challenged on that in the workplace, changed his approach. As he stated,

    I throw a lot of disclaimers before I give specific numbers now because unless you have data to back it up, people will latch onto the numbers and then when it comes back and it's only a 40
and students.

In this first experience, we and the teacher coordinated several lessons in which we used free software to introduce the science of waves. After this introduction, students developed a project in which they created sound installations and reflected on how their installations vibrated and generated sound [2].

We used the free sound editor Audacity [11] for students to visualize waveforms (as an oscilloscope), create pure tones (as a signal generator), and create sound compositions. For a spectrogram, we used the free software UltimaSound (see Figure 1). Using and installing the software on the school's computers was possible because we were
Junior Year Participant Comparison

Discussion and Conclusions
We are grateful to the National Science Foundation for supporting the Sustainable Bridges project. Please note that any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. The data presented here on the first three cohorts of the Engineering Ahead first-year bridge program for pre-major Engineering students are part of the larger Sustainable Bridges project (#1525367).

The preliminary results are promising for the first three cohorts of the first-year
high-stakes nature of placement tests and do not adequately prepare for them (Avery & Kane, 2004; Safran & Visher, 2010; Venezia, Bracco, & Nodine, 2010). This suggests that placement test results may be an inaccurate reflection of students' math skills and knowledge and should be interpreted with some caution. Third, faculty and administrators typically use standardized tests as enrollment management tools in ways that increase the number of students in remedial classes, both because they believe it reduces variation in the academic preparation of students in the higher-level classes and because it is easier to hire staff to teach at lower levels (Melguizo, Kosiewicz, Prather, & Bos, 2014). If this is true, then students may be
Department Head who sees this as the top priority.

The traditional approach to measuring diversity in engineering involves counting racial and ethnic minorities and women, while measuring gains in representation as reflected by the numbers. We believe that this traditional approach needs to be broadened to consider other important aspects of diversity in order to maximize inclusiveness within the field. Decades of educational policy and practice have under-considered the existence of groups such as LGBTQ, poor, and disabled people, thereby perpetuating exclusionary social patterns (Riley et al., 2014). Our multi-pronged approach to increasing diversity and inclusion begins with expanding the fundamental definition of diversity to include
for all students in CBEE? Ultimately, we aspire both to transform the activity systems in CBEE and to serve as a model for others in engineering education as we move towards an inclusive and creative engineering profession for the 21st century.

Acknowledgements
The authors are grateful for the enthusiasm and participation in our work from so many members of our CBEE School community – students, staff, and faculty. We also acknowledge the support provided by the National Science Foundation through grant EEC 1519467. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References
1. Engeström, Y. (2001). Expansive
rate of scholars (losses due to GPA only) will also be assessed for evidence of successful interventions.

Acknowledgment
This work is supported by the National Science Foundation under Grant No. 1153250. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References
[1] Geisinger, Brandi N. and Raman, D. R., "Why They Leave: Understanding Student Attrition from Engineering Majors," International Journal of Engineering Education, 29(4), 914–925.
[2] Chen, Xianglei and Soldner, Matthew, "STEM Attrition: College Students' Paths Into and Out of STEM Fields," National Center for Education
the sophomore year may work better for students once they understand, from the year-long counseling sessions, the need to catch up with their cohort. Unfortunately, participation in the summer bridge has not increased significantly to date.

As we reflect on the overall assessment plan, we realize that while some Program elements have thorough assessments, we need to disaggregate the data even more so that we better understand the various cause-and-effect relationships.

Initial Conclusions
While there are some promising initial results in terms of 1st-to-3rd-semester retention rates, it is clear that participation in the Program elements that help students catch up academically has been low. Since implementation, we have made several changes to the Program
college level during the 2014-15 academic year. The number of student-hours of instruction delivered at the four-year level was double that delivered by community colleges, which may reflect a greater ability to apply the technology or the need for greater depth of instruction at the four-year level.

The gender data show that females are a distinct minority in microcontroller classes and that the class is composed mainly of students of Caucasian ancestry. Students of Hispanic and Asian/Pacific Islander ancestry make up a higher percentage at the four-year level than in two-year community college microcontroller classes.

Interest in professional development workshops similar to those offered through the project seems to remain high. Registrations are
that we can measure the learning experiences and outcomes in these 4 courses. Below are the evaluation results.

Pre-Evaluation
All the participants are students from the computer science department at Georgia State University. The assessment is divided into three parts:
 Work experience with computers and the programming languages used (written response)
 Knowledge of operating systems (choice question)
 Study experience with PCs and different ways to learn (choice question)
The diversity in the nature of the questions reflects both students' understanding of the operating system and the best ways for the students to learn it effectively.

Written response – Operating System:
Work Experience            YES (%)    NO (%)
Have
Tutor showed a statistically significant advantage for the post-test scores on node analysis [t(64) = 3.09, p < 0.05] with an effect size (Cohen's d) of 0.72σ. For mesh analysis, the difference was not statistically significant [t(64) = 0.88, p = 0.38], which may reflect the fundamentally easier nature of that topic (both groups had relatively high averages). The survey results showed a very strong preference for Circuit Tutor and a strong belief that it taught them more effectively than System X. A typical student comment was "I liked Circuit Tutor more because I could do a ton of problems. I liked that even if I couldn't figure it out, I could 'give up' and it would thoroughly explain how to do everything so I could understand what I did
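For readers unfamiliar with the effect-size metric reported above, Cohen's d is the difference between the two group means divided by their pooled standard deviation. The sketch below (in Python, using hypothetical post-test scores rather than the study's actual data) shows the arithmetic:

```python
from statistics import mean, stdev

def cohens_d(a, b):
    """Cohen's d: difference of group means over the pooled standard deviation."""
    na, nb = len(a), len(b)
    # Pooled variance weights each group's sample variance by its degrees of freedom.
    pooled_var = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / pooled_var ** 0.5

# Hypothetical post-test scores for two groups (illustrative only)
treatment = [85, 90, 78, 92, 88, 95, 80, 87]
control = [75, 82, 70, 78, 74, 80, 72, 76]
d = cohens_d(treatment, control)
```

By common rules of thumb, d around 0.2 is a small effect, 0.5 medium, and 0.8 large, so the 0.72σ reported for node analysis is a medium-to-large effect.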
-1711533. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

References
[1] Paulson, D. R., & Faust, J. L. (1988). Active and Cooperative Learning. Los Angeles: California State University, Los Angeles. Retrieved from http://www.calstatela.edu/dept/chem/chem2/Active/index.htm
[2] Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223-231.
[3] Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics
conclusions or recommendations expressed in this paper are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References
[1] Geometric Optics, PhET. Available at: https://phet.colorado.edu/en/simulation/geometric-optics [Accessed 5 Aug. 2017].
[2] B. Alberts, "Prioritizing science education," Science, vol. 328, pp. 249-249, Apr. 2010.
[3] I. E. Allen and J. Seaman, Class Difference$: Online Education in the United States. Babson Survey Research Group, 2010. Available: https://files.eric.ed.gov/fulltext/ED529952.pdf [Accessed December 29, 2017].
[4] T. de Jong, M. Linn, and Z. Zacharia, "Physical and virtual laboratories in science and engineering education," Science, vol. 340
, Oxnard College, Santa Barbara City College, and both the Computer Science and Information Technology departments of CSUCI. One of the first areas discussed was that the curricula at the community colleges and the BSIT program have diverged. Reflective of this is incoming students' surprise at how few of their community college courses transfer as disciplinary credit. The primary recommendation from this review of the data is that the feeder community colleges and CSUCI faculty assess curriculum realignment.

All parties are enthusiastic, and future meetings are planned to reassess curricular alignment in order to assist student progress in transfer and completion. It is worth looking at why this is important and what
renewable resources, the primary topic area of the REU. Data for the first two years of the program (10 students in 2016 and 9 in 2017) are included in the analysis. In addition to the quantitative results from close-ended survey questions, the comments made by the students in response to open-ended questions, both in the focus group and on their surveys, provide additional insight into their reflections on the impact of the REU and their interest in the research topic and in research in general.

Satisfaction
Overall, the students have been happy with the REU experience, and good post-site ratings for the first year became even better in the second year. These ratings are presented in Table 1. Students who gave relatively lower ratings tended to be those who
differentiating factors like race, ethnicity, and age can be thought of as the future scope of this particular study.

Acknowledgement
This material is supported by the National Science Foundation under DUE Grant Numbers 1501952 and 1501938. Any opinions, findings, conclusions, or recommendations presented are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References
[1] Langdon, D., McKittrick, G., Beede, D., Khan, B., & Doms, M. (2011). STEM: Good jobs now and for the future. ESA Issue Brief #03-11. US Department of Commerce.
[2] Carnevale, A. P., Smith, N., & Melton, M. (2011). STEM: Science technology engineering mathematics. Georgetown University Center on Education and the Workforce.
[3
Qualitative Researchers, 2nd ed. Thousand Oaks: SAGE, 2012.
[17] J. Walther, N. W. Sochacka, and N. Kellam, "Quality in Interpretive Engineering Education Research: Reflections on an Example Study," J. Eng. Educ., vol. 102, no. 4, 2013.
[18] L. K. Su, "Quantification of diversity in engineering higher education in the United States," J. Women Minor. Sci. Eng., vol. 16, no. 2, 2010.
[19] E. D. Tate and M. C. Linn, "How does identity shape the experiences of women of color engineering students?," J. Sci. Educ. Technol., vol. 14, no. 5–6, pp. 483–493, 2005.
[20] C. Hill, C. Corbett, and A. St. Rose, Why So Few? Women in Science, Technology, Engineering, and Mathematics. Washington, DC: American Association of University Women
was supported with funding from the National Science Foundation. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References
[1] Arendale, D. (1997). Supplemental Instruction (SI): Review of research concerning the effectiveness of SI from the University of Missouri-Kansas City and other institutions from across the United States.
[2] Dawson, P., van der Meer, J., Skalicky, J., & Cowley, K. (2014). "On the effectiveness of SI: A systematic review of SI and peer-assisted study sessions literature between 2001 and 2010," Review of Educational Research, 84(4), 609–639.
[3] Scott Steinbrink, Karinna M. Vernaza, Barry J. Brinkman
was above 4.0/5.0 across all topics in both the manufacturing excellence session and the manufacturing quality excellence session [25]. That being said, the average score for the non-destructive evaluation (NDE) module in the Manufacturing Quality Excellence session was slightly lower (approximately 3.75/5.0) than those for the other modules. The lower score for NDE may be explained by the larger amount and more technical nature of the learning materials, as reflected in the participants' open-ended comments. Overall, the higher-than-target (3.5/5.0) course evaluation scores demonstrated that the professional development sessions were able to meet course objectives in terms of renewing/enhancing participants' HVM skill set.

5. Conclusions
The National Science
pertaining to female and minority hiring and participation. The unit of analysis is the transcript of each interview or focus group. Researchers will also calculate the extent of match between AM educators' perceptions and AM standards/certifications, as well as use established instruments to measure the extent to which the new professionals report entrepreneurial and intrapreneurial intentions [27-29].

Sampling Note
Rural NW Florida is highly diverse, with over 30% of residents reporting that they are black, Hispanic, or of multiple races; the enrollments of the participating state colleges reflect their communities. Because an intent of this project is to increase participation in AM education and careers, the research team will reach out to
-Fitzpatrick and G. D. Hoople, "Cultivating an Entrepreneurial Mindset: An Interdisciplinary Approach Using Drones," Advances in Engineering Education, vol. 7, no. 3, 2019. www.advances.asee.org/wp-content/uploads/vol07/issue03/Papers/AEE-25-Hoople.pdf
15 G. D. Hoople, A. Choi-Fitzpatrick, and E. Reddy, "Drones for Good: Interdisciplinary Project Based Learning Between Engineering and Peace Studies," International Journal of Engineering Education, vol. 35, no. 5, pp. 1378-1391, 2019. https://www.ijee.ie/latestissues/Vol35-5/12_ijee3801.pdf
16 E. Reddy, G. D. Hoople, and A. Choi-Fitzpatrick, "Interdisciplinarity in Practice: Reflections on Drones as a Classroom Boundary Object," Journal of Engineering Studies, vol. 11
lower than expected correction rates, indicating the necessity to enhance undergraduate solid mechanics education. Considering overall performance by category provides additional evidence regarding students' limited understanding of the multi-scale nature of materials and its linkages to observed mechanical behavior and properties, Figure 5(f). The collected student data indicate that although most of the students were able to identify the meaning of each keyword and categorize the keywords properly in the "materials processing" category (77% of students correctly categorized the keywords belonging to that category), the macro-scale mechanics parameter results indicate significant misconceptions, as reflected by the observation
even further.

Acknowledgements
This material was supported by the National Science Foundation's Research Experiences for Undergraduates (REU) Program (Award no. 1263293). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation.

Bibliography
[1] https://www.nsf.gov/pubs/2013/nsf13542/nsf13542.pdf
[2] Brownell, J. E., and Swaner, L. E. Five High-Impact Practices: Research on Learning, Outcomes, Completion, and Quality; Chapter 4: "Undergraduate Research." Washington, DC: Association of American Colleges and Universities, 2010.
[3] Crowe, M., and Brakke, D. "Assessing the Impact of Undergraduate-Research