should explore the outcomes of women graduate students who benefit from bonding and bridging capital provided through S-STEM programs beyond their time in graduate school. While this study investigated women students currently enrolled in a graduate program, a longitudinal study could help to understand the long-term impact of these programs after degree completion.

Acknowledgment: This material is based upon work supported by the National Science Foundation S-STEM Program under Grant No. 1930451. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
what modifications are required, through end-of-course/workshop surveys and evaluations. For each of these surveys and evaluations, a standard rubric was prepared and provided to the participants, in consultation with the EAC members, to properly reflect the project activity objectives. These formative and summative measures are listed in Table 2.

Table 2. Evaluation plan including formative (F) and summative (S) measures.
Activity Description | Evaluation Measure
(i) New course and laboratory | Continuous consultation and feedback from External Advisory Committee (F & S); Early and end-of-term
engineering education research to assess socio-emotional and cognitive outcomes. Additional work includes the investigation of epistemic insights gained by participants regarding implementing AI in the K-12 environment.

VI. Acknowledgment and Disclaimer
This material is based upon work supported by the National Science Foundation under Grant No. 2147625. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

VII. References
[1] C. Grant, B.J. MacFadden, P. Antonenko, and V. Perez, “3D Fossils for K-12 Education: A Case Example Using the Giant Extinct Shark Carcharocles Megalodon,” Paleontological Society Papers
understand and interrogate the programmatic barriers to student success in engineering across the nation will also expand – leading to a cornucopia of previously unexplored questions at scale.

Acknowledgments
This material is based upon work supported by the National Science Foundation under Grant No. BPE-2152441. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References
[1] F. Curry and J. DeBoer, “A Systematized Literature Review of the Factors that Predict the Retention of Racially Minoritized Students in STEM Graduate
, Journal of Vocational Behavior, 68(1), pp. 73-84, 2006.
22. J.C. Dunlap, Using guided reflective journaling activities to capture students’ changing perceptions, TechTrends, 50(6), pp. 20-26, 2006.
23. H. Rimm and M. Jerusalem, Adaptation and validation of an Estonian version of the general self-efficacy scale (ESES), Anxiety, Stress, & Coping, 12(3), pp. 329-345, 1999.
24. R. Likert, S. Roslow, and G. Murphy, A Simple and Reliable Method of Scoring the Thurstone Attitude Scales, Journal of Social Psychology, 5, pp. 228-238, 1934.
25. R. DeHaan, R. Hanford, K. Kinlaw, D. Philler, and J. Snarey, Promoting ethical reasoning, affect and behaviour among high school students: An evaluation of three teaching
the transferable skills course in their resume and provided examples of how they had demonstrated skill attainment: “I'm looking for a job right now, and I was able to list that as I was trained. It’s been extremely helpful.” Another Cohort 1 student commented that the transferable skills and the interdisciplinary aspect of the NRT had prompted a conversation in which a potential employer emphasized the need for such skills: “He's just like ‘that's really major right now that you already understand trying to connect with other people from different backgrounds and different perspectives to work together to try to get something done’.” When Cohort 1 students were prompted to reflect on what additional supports to promote development in inter
). We expect that our work will inform future efforts to moderate behaviors and team dynamics through interventions such as conflict management and self-advocacy.

Acknowledgments
This work was supported by the National Science Foundation’s Research Initiation in Engineering Formation (RIEF) program under Grant No. 2106322. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of NSF. We also acknowledge the work of Ana Biviano, a graduate researcher on this project. We thank the anonymous reviewers of an earlier draft of this manuscript.

References
Aragon O., Pietri E. and Powell B. (2023) Gender bias in teaching
Endeavour staff was experiencing in and out of the classroom. Also, the researchers felt that the high frequency of the survey delivery (five times over the two-year period of the program) was leading the students to not reflect on the survey questions as deeply as was desired, since they had seen the questions so many times before. Therefore, modifications were continuously being made to the original study design with the first three cohorts (e.g., a shift to focus groups as opposed to Likert-scale surveys). Although the initial survey data would still prove useful for achieving specific aim 3 (an engagement dashboard), engagement measures have since moved to more qualitative methods of data collection [8]. Work is still being done by the staff to pull in
(grant number 2034800). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. The authors thank our project evaluator Dr. Elizabeth Litzler and advisory board member Diana Gonzalez for their support and guidance on this project. The authors also thank the Year 2 participants for supporting this work by sharing their experiences in our survey.

References
[1] T. M. Evans, L. Bira, J. Beltran-Gastelum, L. T. Weiss, and N. L. Vanderford, “Evidence for a mental health crisis in graduate education,” Nature Biotechnology, vol. 36, pp. 282-284, 2018.
[2] A. K. Flatt, “A Suffering Generation: Six factors
in the Journals: Publication Patterns in Political Science,” PS: Political Science & Politics, vol. 50, no. 2, pp. 433–447, Apr. 2017, doi: 10.1017/S1049096516002985.[22] P. Chakravartty, R. Kuo, V. Grubbs, and C. McIlwain, “#CommunicationSoWhite,” Journal of Communication, vol. 68, no. 2, pp. 254–266, Apr. 2018, doi: 10.1093/joc/jqy003.[23] L. Urrieta, L. Méndez, and E. Rodríguez, “‘A moving target’: a critical race analysis of Latina/o faculty experiences, perspectives, and reflections on the tenure and promotion process,” International Journal of Qualitative Studies in Education, vol. 28, no. 10, pp. 1149–1168, Nov. 2015, doi: 10.1080/09518398.2014.974715.[24] A. A. Berhe et al., “Scientists from
this paper are those of the authors and do not necessarily reflect those of the National Science Foundation (NSF).

References
[1] J. Njock Libii, “Building an Infrastructure to Enhance and Sustain the Success of STEM Majors Who are Commuting Students,” presented at 2018 ASEE Annual Conference & Exposition, Salt Lake City, Utah, USA, June 2018, doi: 10.18260/1-2. Paper #30128.
[2] Indiana Commission for Higher Education College Completion Reports, 2022. [Online]. Available: https://www.in.gov/che/files/2022_College_Completion_Report_10_03_2022.pdf
[3] National Center for Education Statistics, “Undergraduate Retention and Graduation Rates,” Condition of Education. U.S. Department of Education, Institute of Education Sciences
EPRA evaluates their attitudes to social responsibility. However, our analysis has a gap: we have not yet assessed differences in student work displaying their ethical reasoning on the problems of the course. The use of the PM evaluations will address this gap and evaluate ethical achievement on the specific projects the courses were designed to prepare them for.

Acknowledgements
This material is based upon work supported by the National Science Foundation, specifically the Division of Undergraduate Engineering in the Directorate for STEM Education, under Grant No. 2020560. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National
ability to interact effectively with people from different cultural backgrounds were measured using a standardized survey instrument. Participants reported an increase in their average research competency ratings after completing the program, as indicated by the survey findings. Those improvements cut across demographics such as gender, race/ethnicity, socioeconomic status, and school type.

Acknowledgments: This work was supported by the National Science Foundation’s International Research Experiences for Students (IRES) Site grant (Grant Numbers: OISE# 1952490-TAMU, 2208801-NCAT, and 195249-UNLV). Any opinions, findings, conclusions, or recommendations presented are those of the authors and do not necessarily reflect the views of the National Science Foundation
upon work supported by the National Science Foundation under Grant No. 1848498. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. The authors wish to thank Dr. Elizabeth Litzler, the Project Evaluator, for her valuable input, and Hannah Chiou for her assistance in reviewing codes. Additionally, we thank the students, advisors, and faculty who participated in the study for sharing their experiences.

References
[1] M. T. Cardador, "Promoted up but also out? The unintended consequences of increasing women’s representation in managerial roles in engineering," Organization Science, vol. 28, pp. 597-617
pedagogy.

Acknowledgements
This material is based upon work supported by the National Science Foundation under Grant No. 1915614. The opinions, findings, and conclusions or recommendations expressed are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

References
[1]. Evaluation Consortium, University at Albany (2016) Experimental Centric Based Engineering Curriculum for HBCUs Leadership Team, HBCU Year Three Report.
[2]. Gough, A., & Gough, N. (2018). Beyond Tinkering and Tailoring: Re-de/signing Methodologies in STEM Education. Canadian Journal of Science, Mathematics & Technology Education, 18(3), 284–290.
[3]. Astatke, Y., Connor, K. A., Newman, D., Attia, J. O., & Nare, O. E. (2016, June), Growing
student experience and what their experience has been like working on their research projects. The information that is collected is used by the program staff to make any changes in mentor/mentee assignments and consider what additional programming might be needed for the participants. Participants also take part in a focus group interview with an external evaluator. Questions ask participants to reflect on their experiences during the summer program, how the program has impacted their career and academic goals, and how the experience has developed confidence in different research skills.

Data Analysis
A subset of eleven questions that considered students’ overall satisfaction, confidence, and self-efficacy in their research skills was considered for this study
success, we will develop and test interventions that foster these beneficial beliefs and attitudes in students.
• Continue to work closely with our collaborating institutions (Purdue and UTEP) to develop and pilot test initiatives as a means of changing NCA factors for students to improve student success.

Acknowledgement
This material is based upon work supported by the National Science Foundation under grant numbers DUE-1626287 (Purdue University), DUE-1626148 (Cal Poly), and DUE-1626185 (University of Texas – El Paso). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. We
take place online in October or November of 2021. Instructors and students will complete another survey, after instructors attend the workshop, and instructors will again complete a follow-up survey in the spring of 2022.

Acknowledgements
This research is supported by the U.S. National Science Foundation (grant numbers DUE-1821092, DUE-1821036, DUE-1821488, and DUE-1821277). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

References
[1] M. Prince, “Does active learning work? A review of the research,” Journal of Engineering Education, vol. 93, pp. 223-232, July 2004, doi: 10.1002/j.2168-9830.2004.tb00809.x.
[2
services they need to succeed. As faculty, we need to be advocates and champions for talented students who have been impacted by catastrophic events if we want to retain and graduate them to become successful STEM professionals.

Acknowledgements
This material is based upon work supported by the National Science Foundation under Grants No. 1354156 (Nanotechnology Center); 1833989 (EECOS); 1833869 (PEARLS); 1832468 and 1832427 (RISE-UP). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. The authors are greatly thankful to the advisory board members and evaluators for their valuable input and feedback. We are also greatly
increased by experiences, there were two participants who had experienced a decrease in confidence, or who saw that in others. The decreases in confidence were reported only by direct pathway students. One direct pathway student reported that an internship had impacted “the level of confidence with which I proclaim results,” but in a way that reflected less confidence instead of more. He reported having given specific numerical results as “a figure of speech”, and after being challenged on that in the workplace, changed his approach. As he stated,

I throw a lot of disclaimers before I give specific numbers now because unless you have data to back it up, people will latch onto the numbers and then when it comes back and it’s only a 40
and students.

In this first experience, we and the teacher coordinated several lessons in which we used free software to introduce the science of waves. After this introduction, students developed a project in which they created sound installations and reflected on how their installations vibrated and generated sound [2].

We used the free sound editor Audacity [11] for students to visualize waveforms (as an oscilloscope), create pure tones (as a signal generator), and create sound compositions. For a spectrogram, we used the free software UltimaSound (see Figure 1). Using and installing the software on the school’s computers was possible because we were
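To illustrate what a signal generator does when it produces a pure tone, the underlying signal math can be sketched in a few lines of standard-library Python. This is only an illustration of the concept, not part of the classroom software; the function name and file name are our own.

```python
import math
import struct
import wave

def write_pure_tone(path, freq_hz=440.0, seconds=2.0, rate=44100, amplitude=0.5):
    """Write a mono 16-bit WAV file containing a pure sine tone."""
    n_samples = int(rate * seconds)
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)   # mono
        wav.setsampwidth(2)   # 16-bit samples
        wav.setframerate(rate)
        frames = bytearray()
        for i in range(n_samples):
            # one sample of a sine wave at freq_hz, scaled to 16-bit range
            sample = amplitude * math.sin(2 * math.pi * freq_hz * i / rate)
            frames += struct.pack("<h", int(sample * 32767))
        wav.writeframes(bytes(frames))

# Example: a two-second A4 (concert pitch) tone
write_pure_tone("tone_a440.wav")
```

Opening the resulting file in a sound editor such as Audacity shows the same waveform, spectrum, and spectrogram views the students worked with.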
Junior Year Participant Comparison

Discussion and Conclusions
We are grateful to the National Science Foundation for supporting the Sustainable Bridges project. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. The data presented here on the first three cohorts of the Engineering Ahead first-year bridge program for pre-major Engineering students is part of the larger Sustainable Bridges project (#1525367).

The preliminary results are promising for the first three cohorts of the first-year
high-stakes nature of placement tests and do not adequately prepare for them (Avery & Kane, 2004; Safran & Visher, 2010; Venezia, Bracco, & Nodine, 2010). This suggests that placement test results may be an inaccurate reflection of students’ math skills and knowledge and should be interpreted with some caution. Third, faculty and administrators typically use standardized tests as enrollment management tools in ways that increase the number of students in remedial classes, both because they believe it reduces variation in the academic preparation of students in the higher-level classes and because it is easier to hire staff to teach at lower levels (Melguizo, Kosiewicz, Prather, & Bos, 2014). If this is true, then students may be
Department Head who sees this as the top priority.

The traditional approach to measuring diversity in engineering involves counting racial and ethnic minorities and women, while measuring gains in representation as reflected by the numbers. We believe that this traditional approach must be broadened to consider other important aspects of diversity in order to maximize inclusiveness within the field. Decades of educational policy and practice have under-considered the existence of groups such as LGBTQ, poor, and disabled students, thereby perpetuating exclusionary social patterns (Riley et al., 2014). Our multi-pronged approach to increasing diversity and inclusion begins with expanding the fundamental definition of diversity to include
for all students in CBEE? Ultimately, we aspire both to transform the activity systems in CBEE and to serve as a model for others in engineering education as we move towards an inclusive and creative engineering profession for the 21st century.

Acknowledgements
The authors are grateful for the enthusiasm and participation in our work from so many members of our CBEE School community – students, staff, and faculty. We also acknowledge the support provided by the National Science Foundation through grant EEC 1519467. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References
1. Engeström, Y. (2001). Expansive
rate of scholars (losses due to GPA only) will also be assessed for evidence of successful interventions.

Acknowledgment
This work is supported by the National Science Foundation Award under Grant No. 1153250. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References
[1] Geisinger, Brandi N. and Raman, D. R., “Why They Leave: Understanding Student Attrition from Engineering Majors,” International Journal of Engineering Education, 29(4), pp. 914–925, 2013.
[2] Chen, Xianglei and Soldner, Matthew, “STEM Attrition: College Students’ Paths Into and Out of STEM Fields,” National Center For Education
the sophomore year may work better for students once they understand, from the year-long counseling sessions, the need to catch up with their cohort. Unfortunately, participation in the summer bridge has not increased significantly to date.

As we reflect on the overall assessment plan, we realize that while some Program elements have thorough assessments, we need to disaggregate the data even more so that we better understand the various cause-and-effect relationships.

Initial Conclusions
While there are some promising initial results in terms of 1st-to-3rd-semester retention rates, it is clear that participation in the Program elements that help students catch up academically has been low. Since implementation, we made several changes to the Program
college level during the 2014-15 academic year. The number of student-hours of instruction delivered at the four-year level was double that delivered by community colleges and may reflect a greater ability to apply the technology or the need for greater depth of instruction at the four-year level.

The gender data shows that females are a distinct minority in microcontroller classes and that the class is composed mainly of students of Caucasian ancestry. Students of Hispanic and Asian/Pacific Islander ancestry make up a higher percentage at the four-year level than in two-year community college microcontroller classes.

Interest in professional development workshops similar to those offered through the project seems to remain high. Registrations are
that we can measure the learning experiences and outcomes in these 4 courses. Below are the evaluation results.

Pre-Evaluation
All the participants are students from the computer science department at Georgia State University. The assessment is divided into three parts:
• Work experience with computers and the programming languages used (written response)
• Knowledge of operating systems (choice questions)
• Study experience with PCs and different ways to learn (choice questions)
The diversity in the nature of the questions reflects both the students’ understanding of the operating system and the best way for the students to learn it effectively.

Written response – Operating System:
Work Experience | YES (%) | NO (%)
Have
Tutor showed a statistically significant advantage for the post-test scores on node analysis [t(64) = 3.09, p < 0.05] with an effect size (Cohen d-value) of 0.72σ. For mesh analysis, the difference was not statistically significant [t(64) = 0.88, p = 0.38], which may reflect the fundamentally easier nature of that topic (both groups had relatively high averages). The survey results showed a very strong preference for Circuit Tutor and a strong belief that it taught them more effectively than System X. A typical student comment was: “I liked Circuit Tutor more because I could do a ton of problems. I liked that even if I couldn't figure it out, I could ‘give up’ and it would thoroughly explain how to do everything so I could understand what I did
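For readers who want to reproduce this kind of comparison from raw score data, the reported quantities (a pooled-variance independent-samples t statistic with df = n1 + n2 - 2, and Cohen's d in pooled-standard-deviation units) can be computed as in the following sketch. This is our own illustration under an equal-variance assumption, not the study's actual analysis code.

```python
import math

def t_and_cohens_d(a, b):
    """Pooled-variance independent-samples t statistic and Cohen's d
    for two score samples a and b (plain lists of numbers)."""
    na, nb = len(a), len(b)
    mean_a, mean_b = sum(a) / na, sum(b) / nb
    # unbiased sample variances
    var_a = sum((x - mean_a) ** 2 for x in a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (nb - 1)
    # pooled variance with df = na + nb - 2 (df = 64 corresponds to 66 students)
    pooled = ((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)
    t = (mean_a - mean_b) / math.sqrt(pooled * (1 / na + 1 / nb))
    d = (mean_a - mean_b) / math.sqrt(pooled)  # effect size in pooled-SD units
    return t, d
```

In practice a library routine such as scipy.stats.ttest_ind gives the t statistic and p-value directly; the hand computation above is shown only to make the df = 64 and the 0.72σ effect size transparent.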