important step towards regularization of these topics in education. At the same time, we recognize that a cultural shift needs to occur for engineering educators to feel both comfortable and equipped to teach decolonial systems design, and to have the tools to do so effectively.

References
[1] S. Winberg and C. Winberg, "Using a social justice approach to decolonize an engineering curriculum," IEEE Global Engineering Education Conference, EDUCON, pp. 248–254, Jun. 2017, doi: 10.1109/EDUCON.2017.7942855.
[2] D. G. Carmichael, "Bias and decision making – an overview systems explanation," Civil Engineering and Environmental Systems, vol. 37, no. 1–2, pp. 48–61, Apr. 2020, doi: 10.1080/10286608.2020.1744133.
[3] M. Agyemang, D
technological policy development, stakeholder voices, and the intertwined cultural, social, and political impacts. My dissertation focused on policy design processes for automated driving systems (ADS).

Engineering U.S. Responsible AI Policy, A Survey, 2020-2025

Abstract
The increase in public access to large-scale AI and the enormous variety of current and potential applications has created widespread excitement and sparked concern over unknown and unintended consequences. While AIs rapidly advance into useful tools across broad applications, we do not yet understand AIs' potential harms, social impacts, and outcomes. The public is increasingly using free AI
Paper ID #47453

Engineering Connection: Growing Sustainable Outreach for Graduate Students

Sara C. Kern, Pennsylvania State University
Sara Kern (she/her) is an Engineering Librarian at Penn State University. She earned her MA in history from Penn State and her MSLIS at Syracuse University. Her research interests include inclusive library outreach and instruction.

Ms. Denise Amanda Wetzel, Pennsylvania State University
Denise A. Wetzel is the Eric N. and Bonnie S. Prystowsky Early Career Science Libraries Professor and Science & Engineering Librarian at Pennsylvania State University Libraries. She is also the Patent and
change. The interview data indicate that larger and more well-established disciplines and departments, particularly those at R1 institutions, may be more heavily siloed, more prone to taking on an "institutional mantle" that prioritizes preservation, and more likely to have strongly embedded scholar-academic belief systems [24]. For example, one participant mentioned characteristics such as "authoritative textbook[s]," the "old guard" that consists of "long-standing, really successful and influential people" and "legendary figures," and a sense of "history and heritage." Another participant reflected that their colleagues who were invested in traditional approaches "genuinely were opposed to these changes more philosophically." One changemaker described how such qualities can
, which will structure the results: STEM-Related Skills and Content (Table 2); Student Feelings, Attitudes, Agency (Table 3); People, Community, and Social Aspects (Table 4); Characteristics of the Course (Table 5); and Other (Table 5). As can be seen in Figure 1, students' responses were most frequently coded one or more times for Characteristics of the Course (47% of codes); 37% of codes referenced STEM-related skills and content. Eight percent of codes related to people, community, or other social aspects as what they liked best about the e4usa class. Four percent of codes discussed students' feelings, attitudes, or agency, and 4% of codes were "other," most of which were no response. To code question 4 regarding the student's desired profession(s
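For illustration only, the percentage breakdown reported above can be reproduced by tallying code frequencies per category; this is a minimal sketch in which the category labels come from the study but the counts are made-up placeholders, not the study's data.

```python
from collections import Counter

# Illustrative tally of qualitative codes; counts below are placeholders, not study data.
code_counts = Counter({
    "Characteristics of the Course": 47,
    "STEM-Related Skills and Content": 37,
    "People, Community, and Social Aspects": 8,
    "Student Feelings, Attitudes, Agency": 4,
    "Other": 4,
})

total = sum(code_counts.values())
for category, count in code_counts.most_common():
    print(f"{category}: {count / total:.0%} of codes")
```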
objective of the study. We also plan to use a combination of analytical and descriptive surveys after each evaluation to assess how the applied teaching methods influence academic integrity, and course-specific surveys will help determine which activities are most effective. By incorporating a control group taught with traditional methods, it will be possible to compare these factors and assess whether the new approaches benefit students.

References
[1] J. Acosta and M. A. Guerra, "Validating Guerra's Blended Flexible Learning framework for Engineering Courses," in 2022 ASEE Annual Conference & Exposition, 2022. Accessed: Apr. 29, 2025. [Online]. Available: https://peer.asee.org/validating-guerra-s-blended-flexible-learning
instance, C1 performed near the average in Milestone 5 and below average in Milestone 6, yet the team maintained a united approach throughout, reflecting their commitment to equity.

Equity Concerns: F4, F7
In contrast, teams F4 and F7 displayed consistently high grades with near-zero deviations in suggested adjustments, raising potential concerns about collusion in the peer review process. F7's dramatic drop in performance during Milestone 3, while maintaining no deviations in peer review data, may indicate a prearranged agreement among members. F4 presents a subtler case, with no single milestone showing significant performance deviation that might reveal team inequities hidden by internal agreement, even showing a later increase in performance. These
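A minimal sketch of the screening heuristic described above, flagging teams whose suggested peer-review adjustments show near-zero spread; the adjustment values and the threshold are assumptions for illustration, not the study's data or exact procedure.

```python
import statistics

# Flag teams whose suggested peer-review adjustments barely vary across members,
# which the discussion above treats as a possible sign of collusion.
# Adjustment values and the 0.05 cutoff are made-up placeholders.
suggested_adjustments = {
    "C1": [-0.8, 0.3, 0.6, -0.1],   # varied adjustments suggest genuine differentiation
    "F4": [0.0, 0.0, 0.0, 0.0],
    "F7": [0.0, 0.0, 0.01, 0.0],
}

FLAT_THRESHOLD = 0.05  # hypothetical cutoff for "near-zero deviation"

for team, adjustments in suggested_adjustments.items():
    spread = statistics.pstdev(adjustments)
    flag = "review for possible collusion" if spread < FLAT_THRESHOLD else "ok"
    print(f"{team}: std dev of adjustments = {spread:.3f} -> {flag}")
```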
2 | and monitoring system for 3D printers to streamline operations, enhance shared use, and boost efficiency. | The limited space for item storage for [Team Y] is causing significant difficulties and constraints.
3 | Engage undergraduate students in teaching and operational roles to foster a culture of knowledge-sharing and hands-on experience. | There is insufficient space allocated for storing [Team Z]'s project archives.
://www.ren21.net/wp-content/uploads/2018/06/17-8652_GSR2018_FullReport_web_final_.pdf.
4. Hsiang, S., R. Kopp, A. Jina, J. Rising, M. Delgado, S. Mohan, D. Rasmussen, R. Muir-Wood, P. Wilson, and M. Oppenheimer, Estimating economic damage from climate change in the United States. Science, 2017. 356(6345): p. 1362-1369.
5. Vicente-Molina, M.A., A. Fernández-Sáinz, and J. Izagirre-Olaizola, Environmental knowledge and other variables affecting pro-environmental behavior: comparison of university students from emerging and advanced countries. Journal of Cleaner Production, 2013. 61: p. 130-138.
6. Meyer, A., Heterogeneity in the preferences and pro-environmental behavior of college students: the effects of years
public institutions are at Associates, Baccalaureate, or Master's Colleges & Universities.

[Figure 9. Map of United States and U.S. territories. There are 37 Newton's Team Participants distributed at institutions across 19 states and territories (green). Created with mapchart.net.]

Student feedback
Student feedback from the online course revealed that the hands-on activities "lost their educational impact when we weren't in a classroom." However, it is challenging to distinguish between the students' views on the hands-on activities and the students' broader dissatisfaction with the sudden shift to online learning during the pandemic. Comments from that semester included statements like, "Difficult to pay attention when always outside of a
proved difficult for participants to understand, set up, and share with the group in a virtual environment. To adjust this activity, the learning objective was first identified: practice writing, testing, and revising an algorithm. Then, a range of activities was considered to allow participants to practice algorithm development. Ultimately, we settled on the "write an algorithm to make a peanut butter and jelly sandwich" activity because it allowed a participant's algorithm to be tested very visibly, through demonstration (a sketch of such an algorithm appears after the list below). Some considerations to make when adjusting an activity for a virtual environment:
• Learning objective(s): Identify the activity's goal. In informal settings, these may not be as clear as they may be in a formal learning
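As a hypothetical illustration only (not part of the original activity materials), a participant's peanut-butter-and-jelly algorithm might be captured as an explicit, ordered list of steps that a facilitator can "execute" literally during a demonstration; the step wording and function name here are assumptions.

```python
# Hypothetical participant algorithm, written as explicit steps so that following it
# literally during a demonstration exposes missing or ambiguous instructions.
PBJ_ALGORITHM = [
    "Pick up one slice of bread and lay it flat on the plate.",
    "Open the peanut butter jar by twisting the lid counterclockwise.",
    "Scoop peanut butter with the knife and spread it on the top face of the slice.",
    "Open the jelly jar by twisting the lid counterclockwise.",
    "Scoop jelly with the knife and spread it on a second slice of bread.",
    "Press the two slices together with the spread sides facing each other.",
]

def run_demonstration(steps):
    """Read each step aloud, one at a time, exactly as written."""
    for number, step in enumerate(steps, start=1):
        print(f"Step {number}: {step}")

run_demonstration(PBJ_ALGORITHM)
```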
Van Treuren (Baylor University) and funding from the Kern Family Foundation.

References:
[1] A. L. Zydney, J. S. Bennett, A. Shahid, and K. W. Bauer, "Impact of Undergraduate Research Experience in Engineering," Journal of Engineering Education, vol. 91, no. 2, pp. 151–157, 2002, doi: 10.1002/j.2168-9830.2002.tb00687.x.
[2] A.-B. Hunter, S. L. Laursen, and E. Seymour, "Becoming a scientist: The role of undergraduate research in students' cognitive, personal, and professional development," Science Education, vol. 91, no. 1, pp. 36–74, 2007, doi: 10.1002/sce.20173.
[3] D. Lopatto, "Survey of Undergraduate Research Experiences (SURE): First Findings," CBE, vol. 3, no. 4, pp. 270–277, Dec. 2004, doi: 10.1187/cbe.04-07-0045.
[4
Meritocracy Hinder Engineers' Ability to Think About Social Injustices," in Engineering Education for Social Justice: Critical Explorations and Opportunities, J. Lucena, Ed., Dordrecht: Springer Netherlands, 2013, pp. 67–84. doi: 10.1007/978-94-007-6350-0_4.
[3] J. A. Leydens and J. C. Lucena, Eds., "Social Justice is Often Invisible in Engineering Education and Practice," in Engineering Justice: Transforming Engineering Education and Practice, 1st ed., Wiley, 2017, pp. 45–66. doi: 10.1002/9781118757369.ch1.
[4] A. Amer, G. Sidhu, M. I. R. Alvarez, J. A. L. Ramos, and S. Srinivasan, "Equity, Diversity, and Inclusion Strategies in Engineering and Computer Science," Educ. Sci., vol. 14, no. 1, Art. no. 1, Jan. 2024, doi: 10.3390
, precision, recall, and F1-score to assess reliability. The results indicated that this dictionary-based approach effectively identified and categorized student perspectives, enabling scalable analysis of ethical reasoning within Mars!'s decision-making framework (see Table 5 and Figure 4).

Table 5: Dictionary-Based Perspective Classifier Metrics
Accuracy: 0.9598 | Precision: 0.9767 | Recall: 0.9598 | F1 Score: 0.9643

[Figure 4: Confusion Matrix for Dictionary-Based Perspective Classifier Predictions]

The dictionary-based classifier successfully distinguishes between first-person, second-person, and third-person perspectives, with high accuracy in clear-cut
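As a rough sketch of how a dictionary-based perspective classifier and its evaluation metrics could be implemented (the pronoun lexicons, function names, and example responses below are assumptions for illustration, not the study's actual dictionary or data):

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Hypothetical pronoun dictionaries; the study's actual lexicons are not given here.
PERSPECTIVE_LEXICON = {
    "first_person":  {"i", "me", "my", "mine", "we", "us", "our", "ours"},
    "second_person": {"you", "your", "yours"},
    "third_person":  {"he", "she", "they", "him", "her", "them", "his",
                      "hers", "their", "theirs", "it", "its"},
}

def classify_perspective(text: str) -> str:
    """Label a response by the perspective whose pronouns appear most often."""
    tokens = text.lower().split()
    counts = {label: sum(tok.strip(".,!?;:") in lexicon for tok in tokens)
              for label, lexicon in PERSPECTIVE_LEXICON.items()}
    best = max(counts, key=counts.get)
    # Fall back to third person when no pronouns are matched at all.
    return best if counts[best] > 0 else "third_person"

# Toy evaluation against hand-labeled responses (illustrative data only).
gold = ["first_person", "second_person", "third_person"]
texts = ["I would reroute the rover to save power.",
         "You should warn the crew before launch.",
         "The mission team must weigh the risks."]
pred = [classify_perspective(t) for t in texts]

acc = accuracy_score(gold, pred)
prec, rec, f1, _ = precision_recall_fscore_support(gold, pred, average="weighted",
                                                   zero_division=0)
print(f"Accuracy {acc:.4f}  Precision {prec:.4f}  Recall {rec:.4f}  F1 {f1:.4f}")
```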
.10983.
[6] I. Lawal and M. England, "One size does not fit all: common practices for standards collections and management," Issues in Science and Technology Librarianship, vol. 102, 2023.
[7] M. Phillips, M. Fosmire, L. Turner, K. Petersheim, and J. Lu, "Comparing the information needs and experiences of undergraduate students and practicing engineers," The Journal of Academic Librarianship, vol. 45, pp. 39-49, 2019.
[8] American National Standards Institute, "United States standards strategy," 2020. [Online]. Available: https://share.ansi.org/Shared%20Documents/Standards%20Activities/NSSC/USSS2020/USSS-2020-Edition.p
[9] J. Baron, J. L. Contreras, M. Husovec, P. Larouche, and N. Thumm, "Making the rules: the
Engineering Dept. Heads Assoc. (ECEDHA)

Dr. Bruk T. Berhane, Florida International University
Dr. Bruk T. Berhane received his bachelor's degree in electrical engineering from the University of Maryland in 2003. He then completed a master's degree in engineering management at George Washington University in 2007. In 2016, he earned a PhD

Prof. Petru Andrei, Florida A&M University - Florida State University
Dr. Petru Andrei is Professor in the Department of Electrical and Computer Engineering at the Florida A&M University and Florida
development is limited; there is an emphasis on designing learning experiences that can be efficiently autograded. Homework assignments are therefore broken up into many subparts.

Beyond summative midterm and final exams, students also complete two course projects: longer, multi-week individual explorations. These course projects resemble in-depth homework assignments in their rigid structure, and they build student understanding of the data science modeling tasks in the course, regression and classification.

Curriculum Process and Timeline
Given the larger investment that students have in Data 100's multi-week projects, we first focused our curriculum development on these assignments and branched out to other course components like lecture and