undergraduate students' working with potential supervisors on research projects actually undermines their opportunities to practice complex problem-solving skills. 2) The competing demands on faculty temper teachers' enthusiasm and investment in developing CPS ability. In fact, after joining the Washington Accord in 2016, Chinese universities have accelerated the process of engineering education reform [26]. Although China's engineering education accreditation standards mandate the inclusion of CPS competency as a graduation requirement, universities have not yet implemented specific training plans to meet these requirements due to various constraints. As a result, teachers have little extrinsic motivation to focus on cultivating students' CPS competency in
. Raviv, W. Zhou and A. Shapira, "Safety by design: dialogues between designers and builders using virtual reality," Construction Management and Economics, vol. 33, pp. 55–72, 2015.
[19] D. Zhao and J. Lucas, "Virtual reality simulation for construction safety promotion," International Journal of Injury Control and Safety Promotion, vol. 22, pp. 57–67, 2015.
[20] J. Goh, S. Hu and Y. Fang, "Human-in-the-loop simulation for crane lift planning in modular construction on-site assembly," in Computing in Civil Engineering 2019: Visualization, Information Modeling, and Simulation, Reston, VA: American Society of Civil Engineers, 2019, pp. 71–78.
[21] P. Wang, P. Wu, H.-L. Chi and X. Li, "Adopting lean thinking in
analysis plan details the instrument reliability and validation tests.

Data Collection and Analysis Plan

Data and evidence gathering for the needs assessment main study are ongoing and involve faculty and students. A sample size of 200, or a ratio of cases to variables of 10:1, has been described as sufficient for exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) [16]. Recruitment emails have been sent through the center of excellence to the 5 EPUs. The current recruitment update shows that about 1000 participants have been recruited. The survey has been administered through the mountain west institution's Qualtrics website. For the reliability test, Cronbach's alpha (> 0.7) and correlation
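The plan above does not prescribe software, but the Cronbach's alpha check it names can be computed directly from a respondents-by-items score matrix. The sketch below uses simulated Likert-style data (not the study's), and the helper name `cronbach_alpha` is ours:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Simulated responses: one shared latent trait plus per-item noise,
# for 200 respondents (the sample size cited above) and 4 items.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
responses = latent + rng.normal(scale=0.8, size=(200, 4))
alpha = cronbach_alpha(responses)
print(f"Cronbach's alpha = {alpha:.2f}")
```

Values above the 0.7 threshold mentioned in the plan are conventionally read as acceptable internal consistency.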
us to conduct research “with” and not “on” another, situating the Other as an equal, not to change the other but to change the self [18].

Methods

Anuli and Glory discovered their shared interests at an online conference, and the idea of writing a duoethnography emerged; they then invited Kelly to join the team, and we became a trio-ethnography. Our research process started in July 2022, when we participated in a 5-week workshop where we worked with mentors to refine our research plan. Thereafter, we met every Monday for an hour over 6 months to execute our study. The collaboration tools we used were Zoom, Google Drive, WhatsApp, and email. After our interview questions were drafted to guide our dialogue, we emailed them to a faculty member
necessary to work in a PBL environment. In this way, the development of interpersonal, structural, task-planning, and related skills becomes an explicit part of our curriculum, assessed separately from the project-based courses in which these skills are applied. The assessment task consists of an approximately 2000-word essay produced under examination conditions and submitted electronically through the learning management system Moodle. Each year around 1500 students complete this essay. The essay is co-marked by both PBL experts and department staff, with a total workload allocation of 20 minutes per essay. The assignment is marked with a simple pass/fail determination, and no explicit feedback beyond the pass/fail grade is provided to the
the extra credit awarded for completing the survey or other events affecting their feelings at the time of filling out the survey. One broad implication of this study is the need to develop effective tools for students to strengthen their Information Gathering skills through various resources; in other words, to show how a decision can be improved by reaching different people and using different processes and products. For example, if a student must decide on a major, one approach could be to reach out to different people (advisors from the university and industry) and to visualize step-by-step prospective career plans. Through such a holistic Information Gathering process, advisors could be assured that students would
as part of their regular duty also allows courses to define standards for flexibilization and fairness between cases of a similar nature across different terms. Our early observations show that scaling this initiative to more large-enrolment courses may be challenging. These challenges are associated with the high workload of the WTA, and increasing the ratio of WTAs to students has associated costs. Thus, we are currently studying ways to automate some of the WTA's tasks without sacrificing the human connection that is so key to the practice. As a next step, we plan to extend the practice to cover first-year courses such as Calculus, Linear Algebra, and Physics. Another key challenge is the engagement of teaching staff; as mentioned above
]. Natural Language Processing (NLP) uses machine learning methods such as transformer-based models [7], [8], which can be applied through fine-tuning or in-context learning. NLP can be used to train algorithms that automate the coding of written responses. Only a few studies in educational applications have leveraged transformer-based models, further prompting an investigation into their use in STEM education. However, because language is complex and its analysis is challenging to automate, NLP has been criticized for increasing the possibility of perpetuating and amplifying harmful stereotypes and implicit biases [9], [10]. This study details preliminary results of planning to use NLP for linguistic justice
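To make the automated-coding idea concrete, the sketch below trains a text classifier on a handful of hand-coded responses. A TF-IDF plus logistic-regression pipeline stands in for the fine-tuned transformer the text describes, and the responses and code labels ("conceptual", "procedural") are invented for illustration only:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hand-coded training examples (invented): each written response is
# labeled with a hypothetical qualitative code.
responses = [
    "I thought about why the forces balance at equilibrium",
    "I followed the steps from the lecture notes",
    "The concept of conservation explains the result",
    "I plugged the numbers into the formula from class",
]
codes = ["conceptual", "procedural", "conceptual", "procedural"]

# Lightweight stand-in for a fine-tuned transformer classifier.
coder = make_pipeline(TfidfVectorizer(), LogisticRegression())
coder.fit(responses, codes)

pred = coder.predict(["Applying the equation step by step gave the answer"])
print(pred[0])
```

A production pipeline would replace the pipeline with a fine-tuned transformer and, given the bias concerns cited above, audit predictions across demographic and linguistic groups before use.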
. Seifert, A. L. Patalano, K. J. Hammond, and T. M. Converse, "Experience and expertise: The role of memory in planning for opportunities," in Expertise in Context: Human and Machine, Menlo Park, CA: AAAI Press, 1997.
[14] K. M. Martin, E. Miskioglu, C. Noble, A. McIntyre, C. S. Bolton, and A. Carberry, "Predicting and Evaluating Engineering Problem Solving (PEEPS): Instrument Development," presented at the Research in Engineering Education Symposium & Australasian Association for Engineering Education Conference, Perth, Australia, 2021.
[15] P. S. Steif and J. A. Dantzler, "A statics concept inventory: Development and psychometric analysis," Journal of Engineering Education, vol. 94, no. 4, pp. 363–371
3 4 5 6 7
teams 1 2 3 4 5 6 7
I like the objectivity of engineering education 1 2 3 4 5 6 7
1 = Strongly Disagree, 2 = Disagree, 3 = Slightly Disagree, 4 = Neutral, 5 = Slightly Agree, 6 = Agree, 7 = Strongly Agree

The list of items is not final; our ongoing research may direct us to add, remove, or amend items. Our future work aims to further refine and psychometrically validate the EUSWQ.

4.1 PSYCHOMETRICS OF EUSWQ AND FUTURE WORK

For our future work, we plan to validate the EUSWQ after presenting it to a larger sample of the undergraduate engineering student population. We aim to conduct two types of psychometric validation analysis. As part of the structural validity of the EUSWQ, exploratory factor analysis (EFA) will be conducted to verify
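As a rough illustration of the planned structural-validity step, an EFA can be run on simulated questionnaire data. The two-factor structure, loading pattern, and item count below are invented for the sketch (the EUSWQ items themselves are not reproduced), and scikit-learn's `FactorAnalysis` stands in for whatever EFA software the study ultimately uses:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Simulated data: two latent factors drive six items with a simple
# loading pattern (items 1-3 load on factor 1, items 4-6 on factor 2).
rng = np.random.default_rng(1)
factors = rng.normal(size=(300, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
items = factors @ loadings.T + rng.normal(scale=0.5, size=(300, 6))

# Exploratory factor analysis with varimax rotation.
fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(items)
print(np.round(fa.components_, 2))  # rows: factors; columns: item loadings
```

Recovered loadings that group the items as simulated would indicate the factor structure is identifiable; real validation would also check fit indices via CFA.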
Table 3: Coding indicators used to determine if a student
criteria) and the search results from a pilot review (see Pilot Review), how the lessons learned from the pilot review were incorporated into refining the search strategy (see Lessons Learned and Search Strategy), and our future plans (see Future Works).

Pilot Review

In the absence of a relevant a priori protocol registered with the Open Science Framework for the current scoping review, a pilot scoping review was conducted to develop one. This pilot review also aimed to establish a systematic search strategy (e.g., a search string, search databases, inclusion criteria) to identify a broad range of primary literature, aligning with the goals of our scoping review (Arksey & O'Malley, 2005). This process also served to enhance the research
transfer KSAs from one module to another, as many issues remain unanswered. As mentioned above, having students consciously transfer what they know from prior modules could challenge them academically more than usual, and any intervention of a similar nature might face the same problem. A student with an improved attitude toward transfer could still find it harder to transfer what they know, thus creating more barriers. The balance between these two elements is undoubtedly important and needs to be explored further. Ultimately, this intervention is a good starting point for increasing transfer-of-learning behaviours in engineering students.

Future Directions (Work-in-Progress)

Moving forward, this research plans to conduct
often in contrast with students' desired learning experience, as further explained in the discussion.

Survey Quantitative Results

As summarized in Table 2, all participants used laptop computers to access Ecampus course materials, and 48 of the 58 participants used their phones for coursework as well. Others also used desktop computers (23 participants) and tablets (14 participants). For content accessed via a web browser, Chrome was the most common browser for engaging with Ecampus course material (37 participants), followed by Firefox (12) and Safari (7), with one user each for Edge and Opera. For the tablet and phone users, Wi-Fi was more common than phone plan data for connecting with course materials, but not all respondents used Wi-Fi
the experience in the form of field notes after the observation is completed. In the context of engineering and STEM education, several observation protocols have been developed to study teaching practices and instructional effectiveness. Below we describe some of the most commonly used observation protocols:

Teaching Dimension Observation Protocol (TDOP). Based on the instructional systems-of-practice framework, the TDOP was developed to observe course planning and classroom instruction [5], [6]. The TDOP is broken down into six dimensions of practice: teaching methods, pedagogical strategies, cognitive demand, student-teacher interactions, student engagement, and instructional technology. Each of these dimensions has between four and 13 individual
/10.3758/BF03197722
Miskioğlu, E. E., Aaron, C., Bolton, C. S., Martin, K. M., Roth, M., Kavale, S. M., & Carberry, A. R. (2023). Situating Intuition in Engineering Practice. Journal of Engineering Education. https://doi.org/10.1002/jee.20521
Reed, S. K. (2016). The structure of ill-structured (and well-structured) problems revisited. Educational Psychology Review, 28(4), 691–716. https://doi.org/10.1007/s10648-015-9343-1
Saldaña, J. (2021). The Coding Manual for Qualitative Researchers. SAGE Publications Ltd.
Seifert, C. M., Patalano, A. L., Hammond, K. J., & Converse, T. M. (1997). Experience and expertise: The role of memory in planning for opportunities. In P. J. Feltovich, K. M. Ford, & R. R. Hoffman (Eds
Engineering Experiences Survey

Abstract

This research paper presents validity evidence for a sophomore engineering experience survey that provides an initial understanding of how sophomores experienced their second year of engineering studies. While the sophomore year is a pivotal transition for engineering students, existing research and practice have largely overlooked this crucial period. There is a need to assess these students and understand more about their college experiences so that interventions can be planned and implemented. The primary aim of this research is to establish validity evidence for the scales used in the Sophomore Engineering Experiences Survey (SEES). The survey was adapted from Schreiner's Sophomore Experiences Survey and guided by
current study addresses the following research questions:
1. What motivates students to attend scheduled class sessions with ungraded attendance?
2. Are there differences in motivating factors that depend on the structure of the class session (in this case, lecture versus laboratory)?

This paper presents preliminary results from an end-of-semester survey and discusses plans for repeating the survey in a future offering of the course.

Methodology

The survey design was inspired by surveys of attendance in prior work [7], [8], [9], with the addition of open-ended questions, consisting of the following prompts regarding lecture sessions:
1. Please estimate the percentage of lectures that you attended prior to the first exam.
2. Please
. Chase, "Engineering stress culture in project-based engineering programs," in Proceedings of the 2022 Annual Conference of the American Society for Engineering Education, Minneapolis, MN, USA, June 2022.
[16] S. Lovibond and P. Lovibond, Manual for the Depression Anxiety Stress Scales (2nd ed.). Psychology Foundation, 1995.
[17] B. D. Jones, M. C. Paretti, S. F. Hein, and T. W. Knott, "An analysis of motivation constructs with first-year engineering students: Relationships among expectancies, values, achievement, and career plans," Journal of Engineering Education, vol. 99.
[18] W. C. Lee, P. R. Brown, and H. M. Matusovich, "Measuring underrepresented student perceptions of inclusion within engineering departments and universities," International Journal
guidance in the planning and implementation of the intervention [9]–[14]. An initial development of a proactive advising survey instrument is reported. Survey items were drawn from two validated sources: the MMRE survey instrument [5] and the SUCCESS instrument [15], [16]. A concise short-form instrument is desired for the current application to maximize the likelihood that students will complete the entire survey. Since both the MMRE and SUCCESS instruments are relatively long, a subset of questions from these instruments was initially included: seven questions for each of the four constructs of self-efficacy, teamwork self-efficacy, engineering identity, and commitment to an engineering career. Recognizing that the validity and reliability of
size of the interviews we conduct, as well as initiating interviews with other categories (employers, multiple universities). While our activities focus on academic makerspaces, we plan to validate our data against the breadth of design and fabrication studios, including a variety of makerspace operational structures in public and private institutions, community colleges, and vocational colleges. After concluding both Activity 1 and Activity 2, we will develop a final report providing useful guidance on the value of investments in design and fabrication studios for organizations that make education investment decisions. The tangible outcomes of this study – such as the specific forms of the developed tools for assessing makerspaces – will be more fully realized as the
needs to “retain” to be cost-effective, and discussed the impact of the pandemic on education. In a later study, the authors will compare results from surveys sent to both students and faculty to gauge the effectiveness of SI. Preliminary findings reveal the University of South Alabama's current SI model to be highly effective for a university of its size, and these results and our model have the potential to lead other similarly sized universities to the same outcomes. Future work will expand on this paper to include a plan for other universities. It is expected that the survey results will show a positive correlation between SI services and student retention. However, the impact of COVID on SI sessions is unknown
- and post-surveys, the research team utilized the Mann-Whitney U and Wilcoxon signed-rank tests to evaluate the impact of the intervention. The findings demonstrated a significant enhancement in GTAs' skills across all surveyed domains, irrespective of their prior teaching experience. The study's results validate the survey instrument's utility in capturing the nuanced aspects of GTAs' pedagogical growth and confirm the targeted course modules' efficacy in advancing their teaching and leadership proficiency. Plans for ongoing instrument refinement and the potential for broader application underscore the study's significance in elevating GTA training effectiveness and pedagogical excellence.

Introduction

Graduate Teaching Assistants (GTAs) in engineering
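The two nonparametric tests named above are available in scipy. The sketch below runs both on simulated pre/post self-ratings (not the study's data); the sample size, rating scale, and the novice/experienced split are all invented for illustration:

```python
import numpy as np
from scipy.stats import mannwhitneyu, wilcoxon

# Simulated 1-7 self-ratings for 30 GTAs before and after the modules.
rng = np.random.default_rng(2)
pre = rng.integers(2, 6, size=30).astype(float)
post = pre + rng.integers(0, 3, size=30)  # simulated improvement

# Wilcoxon signed-rank: paired pre/post comparison within one group.
w_stat, w_p = wilcoxon(pre, post)

# Mann-Whitney U: two independent groups, e.g. splitting the post
# scores by prior teaching experience (split here is arbitrary).
u_stat, u_p = mannwhitneyu(post[:15], post[15:])
print(f"Wilcoxon p = {w_p:.3f}, Mann-Whitney U p = {u_p:.3f}")
```

The Wilcoxon test suits the paired pre/post design, while the Mann-Whitney U test suits the between-group comparison by prior experience; neither assumes normally distributed ratings.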
immediate and prospective indicators. We hope it helps in future conference planning and in determining the conference's impact on participants' research, teaching, and professional development. Immediate indicators included: 1) experience metrics, such as experience with the technical aspects of the conference; 2) satisfaction metrics, indicating the benefits of the conference for research, professional development, and teaching; and 3) quality ratings, indicating the participants' overall experience at the conference. All immediate indicators showed a high satisfaction rate among participants, with 35% rating the conference 8 on a scale of 10, 28% rating it 9, and about 12% each rating it 7 and 10. More than half of the survey participants found the conference beneficial for
providing more scaffolding opportunities for participant learning during week 2. Specifically, mentors not only made sure that participants conducted the lab exercises, but also explained why certain things did not work and walked through troubleshooting instructions. Mentor training was improved to expand mentors' exposure to the project, ensuring they were able to explain the project development plan and that every student, and the team as a whole, understood the goals and was able to participate in the project development.

Data Collection Techniques and Measures

Data collection consisted of three techniques: a survey, a reflection activity, and an engineering identity formation assessment. Survey data were collected at 8 time points using established
as well, because they only need to attend one event to get information about many schools. In July, a call for participation followed through both the graduate program sub-committee and the EECHA list. At that time, ten (10) schools had agreed to participate, and other schools were solicited for participation. Commitment from the schools involved in the event included:
● Providing input on the timing of the event
● Attending a planning meeting for the event
● Providing input on the name of the event
● Providing input on the information collected from prospective students who register for the event
● Volunteering to participate in one of two panels for the event
  ○ Graduate school application tips
students to identify themes that are of interest to traditional students and thus increase engagement in classrooms. A semi-structured interview process and coding similar to those used with NTES will be conducted with traditional students.

Finalize Attributes and Leverage NTES Lived Experience

Using the results from both sets of interviews, with NTES and with traditional engineering students, we will finalize the attributes that both groups of students deemed of interest to them. Our approach to leveraging NTES lived experience is to create a set of in-class cooperative learning activities as a proof of concept, and then to develop a methodology for creating such activities, focused on NTES lived experience, that other instructors could use to
synthesize and identify patterns observed to date in how student background is associated with engineering identity and career plans, drawing on 60+ students participating in mentored materials engineering research across five cohorts. Using individual and focus group interviews to investigate the intersectional experiences of students, we engage engineering students' counter-stories in context, from a critical, social justice perspective attending to multiple axes of identity. Across our analyses, we find evidence that stable and consistent support fosters and sustains engineering identity, sense of belonging, and career ambitions. Implications are offered with respect to programmatic, research, and policy directions.

Keywords— External evaluation, Inclusivity
) coordinates, along with a presence value. All features are scaled to the [0, 1] range, resulting in a total of 91 normalized affect features. To enable real-time prediction of user-specific performance, the system undergoes initial tuning. During initial interactions, sequences of affect features and corresponding scores are collected. If these interactions take the form of lessons, the student's video is segmented into sequences delineated by in-lesson assessments. Affect sequences and scores are used to tune the performance predictor, and we plan to train and test Transformer, Recurrent Neural Network (RNN), and Long Short-Term Memory (LSTM) models. After adequate tuning, the performance predictor becomes operational in subsequent sessions. Consequently
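The [0, 1] scaling step described above amounts to a per-column min-max transform over the affect features. The sketch below applies it to a random placeholder matrix of 120 frames by 91 features (the helper name `minmax_scale` is ours, and the Transformer/RNN/LSTM tuning itself is beyond this sketch):

```python
import numpy as np

def minmax_scale(features):
    """Scale each feature column to [0, 1]; constant columns map to 0."""
    lo = features.min(axis=0)
    hi = features.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)  # guard against divide-by-zero
    return (features - lo) / span

# Placeholder data: 120 video frames by 91 raw affect features
# (landmark coordinates plus a presence value).
raw = np.random.default_rng(3).normal(size=(120, 91))
scaled = minmax_scale(raw)
print(scaled.min(), scaled.max())
```

In deployment, the minimum and maximum would be fixed from the tuning data (or from known coordinate ranges) rather than recomputed per session, so that scaled features remain comparable across sessions.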
encourage students to discuss their predictions of what will happen with their peers, rather than just answering with iClicker, as this has been shown to further improve student learning [8, 14]. Lastly, we plan to reshoot some of these videos using best practices to improve their effectiveness, such as showing demonstrations from a first-person perspective [14], writing out key information as the demonstration is given rather than just displaying it [15], and focusing on visual tabletop demonstrations [16]. We believe that these changes can further improve the quality of our demonstration videos and the overall educational experience of our students by providing high-quality, exciting demonstrations in a course where they previously did not have