State University, and a PhD student in the Woodruff School of Mechanical Engineering at Georgia Institute of Technology. Ancalle earned a B.S. from the University of Puerto Rico at Mayagüez and an M.S. from the University of Illinois at Urbana-Champaign, both in civil engineering. He has a passion for teaching undergraduate engineering courses, which has driven his teaching career for the past six years. He recently began working in the area of engineering education and plans to continue this path after completing his graduate studies. ©American Society for Engineering Education, 2023 Validity evidence for measures of statistical reasoning and statistical
. 2017. doi: 10.17226/24622.
[4] D. H. Kinkel and S. E. Henke, “Impact of Undergraduate Research on Academic Performance, Educational Planning, and Career Development,” Journal of Natural Resources and Life Sciences Education, vol. 35, no. 1, pp. 194–201, 2006, doi: 10.2134/jnrlse2006.0194.
[5] R. Taraban and E. Logue, “Academic factors that affect undergraduate research experiences,” Journal of Educational Psychology, vol. 104, no. 2, pp. 499–514, 2012, doi: 10.1037/a0026851.
[6] S. Baron, P. Brown, T. Cumming, and M. Mengeling, “The Impact of Undergraduate Research and Student Characteristics on Student Success Metrics at an Urban, Minority Serving, Commuter, Public Institution,” Publications and Research, Apr. 2020
constructed object between both activity systems, such as a care plan agreed upon by doctor and patient [17].

Figure 1. Third generation cultural-historical activity theory (CHAT) [17].

To align with the findings from the study reported here, we have adapted Engeström’s framework to fit our educational context, as shown in Figure 2. The faculty and student activity systems each comprise the six interacting elements discussed above, similar to those shown in Figure 1. The subjects of the faculty activity system are Ash and Birch, the two faculty instructors we interviewed, and their community includes other faculty, student instructors, support staff, and others. The subjects of the student activity system are students taking both Ash’s and
bricolage into the makerspace literature is used to argue for making experiences that are not completely planned, which corroborates the idea of constructionism that the planned and the unplanned are both essential parts of the learning experience [58].

Conclusions

In this paper, we presented the process and outcomes for the creation of operational definitions for aspects of learning within makerspaces, with the goal of aiding the development of assessment instruments. We first established the importance of having a solid theoretical basis to explore the different nuances of learning in makerspaces, which we accomplished through the use of the Learning Through Making Typology. Through the cooperation of a team that included some of the proponents of the
with learning outcomes in the end-of-quarter assessment. The difference in correlation between the early and later assessment outcomes may be because the impact of the explanatory learning activities takes time to build up for students. Overall, students perceived the new learning activities very positively. In the cognitive aspect, students acknowledged that the explanatory learning activities encouraged them to think about the deeper structure of the homework problem, to do more planning before solving the problem rather than rushing, and to review lecture materials and the textbook rather than just rushing to complete the homework. These changes in learning behavior are characteristic of deeper learning. There is a limitation in this study
lectures to the end of identifying and modeling systems requirements in addition to developing security plans and implementing a database.

Intercultural Competence Intervention

In order to help students understand the need for and importance of intercultural competence, two portable intercultural modules (PIM) focused on intercultural competencies were integrated into the course. These PIMs were introduced in the 6th and 10th week of the semester, respectively. Both PIMs required students to view videos and complete activities and readings, following which their knowledge was tested in the form of quizzes and written reflections. The two specific PIMs utilized in this course are titled “Productive Conflict” and “Tricky Communication: Intent
include religion, age, gender, etc. [8, 9]. Although models using these predictors yield somewhat accurate results, they do not consider the students’ work ethic or study habits. Therefore, we plan to factor in students’ efforts when predicting their course performance. One of the best ways to measure how much a student cares about their academic performance is to analyze their participation in the class [1, 10, 11]. A discussion forum is a platform that enables students to seek help from their peers and instructors. Multiple studies have focused on producing and analyzing the statistical correlation between discussion forum data and student course performance [11, 12, 13]. While statistical correlations can benefit inference, student
needs. As such, program leaders must work to (1) provide effective, accurate, and personalized support; and (2) provide information and recommendations for curricular developments and resource management. Both efforts rely on a strong foundation of data to inform decision-making. Accordingly, this paper describes the quantitative portion of a larger mixed-methods project, in which the authors identified initial baseline conditions of students’ academic performance in the focal course and revealed potential influential factors through a logistic regression model predicting the likelihood that a student receives a passing grade. Future plans for educational data mining beyond the focal course are discussed. This work suggests some opportunities for
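As a rough illustration of the modeling step described above, a logistic regression classifier for pass/fail prediction can be fit with plain gradient descent. The predictors and data below are invented for illustration; the actual variables used in the study are not listed in this excerpt.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic-regression weights (plus intercept) by batch gradient descent."""
    n, d = len(X), len(X[0])
    w = [0.0] * (d + 1)  # w[0] is the intercept
    for _ in range(epochs):
        grad = [0.0] * (d + 1)
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = p - yi  # derivative of log-loss w.r.t. the linear term
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        for j in range(d + 1):
            w[j] -= lr * grad[j] / n
    return w

def predict_proba(w, xi):
    """Predicted probability of a passing grade for one student."""
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))

# Hypothetical predictors per student: [prior GPA, scaled forum activity];
# label 1 = passed the focal course, 0 = did not.
X = [[3.8, 1.2], [2.1, 0.1], [3.0, 0.8], [1.9, 0.0], [3.5, 1.0], [2.4, 0.3]]
y = [1, 0, 1, 0, 1, 0]
w = fit_logistic(X, y)
```

Each fitted coefficient can be read as the change in the log-odds of passing per unit change in the corresponding predictor, which is what makes the model interpretable for advising.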
with faculty affiliated with the program, and peer/near-peer mentoring. At the time of data collection, the program was in its third cohort. Participants and Recruitment: All participants in this study are first- or second-year MS students enrolled in an engineering field at the institution of focus in this study. All MS students are required to do research and write a Master’s paper or thesis. All participants recruited for this study were part of the SSTEM, although participation in this particular study was optional. IRB approval was obtained for the entire project and all data collection; the interviews collected and analyzed in this study are part of the broader engineering education research plan in the funded SSTEM project. Six students
undergraduate students’ working with potential supervisors on research projects actually diminishes their chances to practice complex problem-solving skills. 2) The competing demands on faculty actually temper teachers’ enthusiasm for and investment in developing CPS ability. In fact, after joining the Washington Accord in 2016, Chinese universities have accelerated the process of engineering education reform [26]. Although China's engineering education certification standards mandate the inclusion of CPS competency as a graduation requirement, universities have not yet implemented a specific training plan to meet these requirements due to various constraints. As a result, teachers have no extrinsic motivation to focus on students' CPS competency cultivation in
. Raviv, W. Zhou and A. Shapira, "Safety by design: dialogues between designers and builders using virtual reality," Construction Management and Economics, vol. 33, pp. 55–72, 2015.
[19] D. Zhao and J. Lucas, "Virtual reality simulation for construction safety promotion," International Journal of Injury Control and Safety Promotion, vol. 22, pp. 57–67, 2015.
[20] J. Goh, S. Hu and Y. Fang, "Human-in-the-loop simulation for crane lift planning in modular construction on-site assembly," in Computing in Civil Engineering 2019: Visualization, Information Modeling, and Simulation, American Society of Civil Engineers, Reston, VA, 2019, pp. 71–78.
[21] P. Wang, P. Wu, H.-L. Chi and X. Li, "Adopting lean thinking in
analysis plan details the instrument reliability and validation tests.

Data Collection and Analysis Plan

Data and evidence gathering for the needs assessment main study is ongoing and involves faculty and students. A sample size of 200, or a ratio of the number of cases to the number of variables of 10:1, has been described as sufficient for exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) [16]. Recruitment emails have been sent through the center of excellence to the 5 EPUs. The current recruitment update shows that about 1000 participants have been recruited. The survey has been administered through the mountain west institution's Qualtrics website. For the reliability test, Cronbach's alpha (> 0.7) and correlation
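The Cronbach's alpha threshold (> 0.7) mentioned for the reliability test can be computed directly from item-level responses. A minimal stdlib-only sketch, with hypothetical survey data (the real instrument's items are not shown in this excerpt):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """items: list of per-item score lists, one inner list per survey item,
    all covering the same respondents in the same order."""
    k = len(items)
    item_var_sum = sum(pvariance(scores) for scores in items)
    # Total score per respondent, summing across items
    totals = [sum(resp) for resp in zip(*items)]
    total_var = pvariance(totals)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical 4-item Likert scale, 5 respondents (rows = items, cols = respondents)
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 3],
    [4, 5, 3, 4, 2],
]
alpha = cronbach_alpha(items)  # high inter-item agreement gives alpha well above 0.7
```

In practice the same computation is applied per construct (sub-scale), not to the whole survey at once, since alpha assumes the items measure a single latent trait.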
us to conduct research “with” and not “on” another, situating the Other as equals, not to change the other but to change the self [18].

Methods

Anuli and Glory discovered that they had similar interests at an online conference, and thereafter the idea of writing a duoethnography emerged; they then invited Kelly to join the team, and we became a trio-ethnography. Our research process started in July 2022 when we participated in a 5-week workshop where we worked with mentors to refine our research plan. Thereafter, we met every Monday for an hour over 6 months to execute our study. Collaboration tools utilized were Zoom, Google Drive, WhatsApp, and email. After our interview questions were drafted to guide our dialogue, we emailed them to a faculty member
necessary to work in a PBL environment. In this way the development of interpersonal, structural, task-planning, and related skills becomes an explicit part of our curriculum, which is assessed separately from the project-based courses in which these skills are applied. The assessment task consists of an approximately 2000-word essay that is produced under examination conditions. It is submitted electronically through the learning management system Moodle. Each year around 1500 students complete this essay. The essay is co-marked by both PBL experts and department staff, with a total workload allocation of 20 minutes per essay. The assignment is marked with a simple pass/fail determination, and no explicit feedback beyond the pass/fail grade is provided to the
the extra credit awarded for completing the survey or other events affecting their feelings at the time they completed the survey. Some broad implications of this study are to develop effective tools for students to strengthen their Information Gathering skills through various resources; in other words, how a decision can be optimized by reaching different people and using different processes and products. For example, if a student is required to decide on a major, one idea could be to reach out to different people (advisors from university and industry) and to visualize step-by-step prospective career plans. Through such a holistic Information Gathering process, advisors could be assured that students would
as part of their regular duty also allows courses to define standards for flexibilization and fairness between cases of similar nature across different terms. Our early observations show that scaling this initiative to more large-enrolment courses may be challenging. These challenges are associated with the high workload of the WTA. Increasing the ratio of WTAs to the number of students has associated costs. Thus, we are currently studying ways to automate some of the tasks of the WTA without sacrificing the human connection that is so key to the practice. As a next step, we plan to extend the practice to cover first-year courses like Calculus, Linear Algebra, and Physics. Another key challenge is the engagement of teaching staff; as mentioned above
]. Natural Language Processing (NLP) uses machine learning methods like transformer-based models [7], [8], which can be applied through fine-tuning or in-context learning. NLP can be used to train algorithms that automate the coding of written responses. Only a few studies in educational applications have leveraged transformer-based machine learning models, further prompting an investigation into their use in STEM education. However, because language analysis is complex and challenging to automate, NLP has been criticized for increasing the possibility of perpetuating and amplifying harmful stereotypes and implicit biases [9], [10]. This study details preliminary results to plan for using NLP for linguistic justice
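To make the idea of automated coding of written responses concrete, the sketch below uses a tiny multinomial naive Bayes classifier (deliberately much simpler than the transformer-based models the text discusses) with invented response snippets and invented code labels:

```python
import math
from collections import Counter, defaultdict

def train(docs):
    """docs: list of (text, label) pairs. Returns label frequencies and
    per-label word counts for a multinomial naive Bayes model."""
    label_counts = Counter(label for _, label in docs)
    word_counts = defaultdict(Counter)
    for text, label in docs:
        word_counts[label].update(text.lower().split())
    return label_counts, word_counts

def classify(text, label_counts, word_counts):
    """Assign the label with the highest log-posterior (Laplace smoothing)."""
    vocab = {w for counts in word_counts.values() for w in counts}
    total_docs = sum(label_counts.values())
    best, best_lp = None, float("-inf")
    for label, n in label_counts.items():
        lp = math.log(n / total_docs)  # prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.lower().split():
            lp += math.log((word_counts[label][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Hypothetical hand-coded student responses; the labels are invented here
# purely for illustration, not drawn from the study's codebook.
docs = [
    ("i tested the circuit and measured voltage", "procedural"),
    ("we soldered and measured the resistor", "procedural"),
    ("i felt proud of our team design", "affective"),
    ("the team felt excited about the project", "affective"),
]
model = train(docs)
```

A transformer-based coder replaces the word counts with learned contextual representations, but the workflow is the same: train on hand-coded responses, then predict codes for the remainder.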
. Seifert, A. L. Patalano, K. J. Hammond, and T. M. Converse, “Experience and expertise: The role of memory in planning for opportunities,” in Expertise in Context: Human and Machine, Menlo Park, CA: AAAI Press, 1997.
[14] K. M. Martin, E. Miskioglu, C. Noble, A. McIntyre, C. S. Bolton, and A. Carberry, "Predicting and Evaluating Engineering Problem Solving (PEEPS): Instrument Development," presented at the Research in Engineering Education Symposium & Australasian Association for Engineering Education Conference, Perth, Australia, 2021.
[15] P. S. Steif and J. A. Dantzler, “A statics concept inventory: Development and psychometric analysis,” Journal of Engineering Education, vol. 94, no. 4, pp. 363–371
teams 1 2 3 4 5 6 7
I like the objectivity of engineering education 1 2 3 4 5 6 7
1 = Strongly Disagree, 2 = Disagree, 3 = Slightly Disagree, 4 = Neutral, 5 = Slightly Agree, 6 = Agree, 7 = Strongly Agree

The list of items is not final. Our ongoing research may direct us to add, remove, or amend items. Our future work aims to further refine and psychometrically validate the EUSWQ.

4.1 PSYCHOMETRICS OF EUSWQ AND FUTURE WORK

For our future work, we are planning to validate the EUSWQ after presenting it to a larger number of the undergraduate engineering student population. We aim to conduct two types of psychometric validation analysis. As part of the structural validity of the EUSWQ, exploratory factor analysis (EFA) will be conducted to verify
- expresses ability, personal values, or motivation relative to quantitative tasks
- demonstrates knowledge/skill with definitions, information, equations, basic concepts, and quantitative operations
- uses essential information or quantitative operations to make a decision, compare/contrast, build a model, project, plan, etc.
- presents quantitative information or tasks to an external or pseudo audience

Table 3: Coding indicators used to determine if a student
criteria) for and the search results from a pilot review (see Pilot Review), how the lessons learned from the pilot review were incorporated into refining the search strategy (see Lessons Learned and Search Strategy), and our future plans (see Future Works).

Pilot Review

In the absence of a relevant a priori protocol registered with the Open Science Framework for the current scoping review, a pilot scoping review was conducted to develop one. This pilot review also aimed to establish a systematic search strategy (e.g., a search string, search databases, inclusion criteria) to identify a broad range of primary literature, aligning with the goals of our scoping review (Arksey & O’Malley, 2005). This process also served to enhance the research
transfer KSAs from one module to another, as there are many issues left unanswered. As mentioned above, having students consciously transfer what they know from prior modules could challenge them academically more than usual; any intervention of a similar nature might face the same problem. A student with an improved attitude toward transfer could still find it harder to transfer what they know, thus creating more barriers. The balance between these two elements is undoubtedly important and needs to be explored further. Ultimately, this intervention makes for a good starting point for increasing the transfer of learning behaviours in engineering students.

Future Directions (Work-in-Progress)

Moving forward, this research plans to conduct
often in contrast with students’ desired learning experience, as further explained in the discussion.

Survey Quantitative Results

As summarized in Table 2, all participants used laptop computers to access Ecampus course materials, and 48 of the 58 participants used their phone for coursework as well. Others also used desktop computers (23 participants) and tablets (14 participants). For content accessed via a web browser, Chrome was the most common browser for engaging with Ecampus course material (37 participants). Next were Firefox (12) and Safari (7), followed by one user each for Edge and Opera. For the tablet and phone users, Wi-Fi was more common than phone plan data for connecting with course materials, but not all respondents used Wi-Fi
the experience in the form of field notes after the observation is completed. In the context of engineering and STEM education, several observation protocols have been developed to study teaching practices and instructional effectiveness. Below we describe some of the most commonly used observation protocols:

Teaching Dimension Observation Protocol (TDOP). Based on the instructional systems-of-practice framework, the TDOP was developed to observe course planning and classroom instruction [5], [6]. The TDOP is broken down into six dimensions of practice: teaching methods, pedagogical strategies, cognitive demand, student-teacher interactions, student engagement, and instructional technology. Each of these dimensions has between four and 13 individual
/10.3758/BF03197722
Miskioğlu, E. E., Aaron, C., Bolton, C. S., Martin, K. M., Roth, M., Kavale, S. M., & Carberry, A. R. (2023). Situating intuition in engineering practice. Journal of Engineering Education. https://doi.org/10.1002/jee.20521
Reed, S. K. (2016). The structure of ill-structured (and well-structured) problems revisited. Educational Psychology Review, 28(4), 691-716. https://doi.org/10.1007/s10648-015-9343-1
Saldaña, J. (2021). The Coding Manual for Qualitative Researchers. SAGE Publications Ltd.
Seifert, C. M., Patalano, A. L., Hammond, K. J., & Converse, T. M. (1997). Experience and expertise: The role of memory in planning for opportunities. In P. J. Feltovich, K. M. Ford, & R. R. Hoffman (Eds
Engineering Experiences Survey

Abstract

This research paper presents validity evidence for a sophomore engineering experience survey that provides an initial understanding of how sophomores experienced their second year of engineering studies. While the sophomore year is a pivotal transition for engineering students, existing research and practices have largely overlooked this crucial period. There is a need to assess these students and understand more about their college experiences so interventions can be planned and implemented. The primary aim of this research is to establish validity evidence for the scales used in the Sophomore Engineering Experiences Survey (SEES). The survey was adapted from Schreiner’s Sophomore Experiences Survey and guided by
current study addresses the following research questions:
1. What motivates students to attend scheduled class sessions with ungraded attendance?
2. Are there differences in motivating factors that depend on the structure of the class session (in this case, lecture versus laboratory)?

This paper presents preliminary results from an end-of-semester survey and discusses plans for repeating the survey in a future offering of the course.

Methodology

The survey design was inspired by surveys of attendance in prior work [7], [8], [9], with the addition of open-ended questions, consisting of the following prompts regarding lecture sessions:
1. Please estimate the percentage of lectures that you attended prior to the first exam.
2. Please
. Chase, “Engineering stress culture in project-based engineering programs,” in Proceedings of the 2022 Annual Conference of the American Society for Engineering Education, Minneapolis, MN, USA, June 2022.
[16] S. Lovibond and P. Lovibond, Manual for the Depression Anxiety Stress Scales (2nd ed.). Psychology Foundation, 1995.
[17] B. D. Jones, M. C. Paretti, S. F. Hein, and T. Knott, “An analysis of motivation constructs with first-year engineering students: Relationships among expectancies, values, achievement, and career plans,” Journal of Engineering Education, vol. 99.
[18] W. C. Lee, H. M. Matusovich, and P. Brown, “Measuring underrepresented student perceptions of inclusion within engineering departments and universities,” International Journal
guidance in the planning and implementation of the intervention [9]–[14]. An initial development of a proactive advising survey instrument is reported. Survey items were drawn from two validated sources: the MMRE survey instrument [5] and the SUCCESS instrument [15], [16]. A concise short-form instrument is desired for the current application to maximize the likelihood that students will complete the entire survey. Since both the MMRE and SUCCESS instruments are relatively long, a subset of questions from these instruments is initially included. Seven questions were selected for each of the four constructs: self-efficacy, teamwork self-efficacy, engineering identity, and commitment to an engineering career. Recognizing that the validity and reliability of
size of the interviews we conduct as well as initiating interviews with other categories (employers, multiple universities). While our activities focus on academic makerspaces, we plan to validate our data against the breadth of design and fabrication studios, including a variety of makerspace operational structures in public/private institutions and community and vocational colleges. After concluding both Activity 1 and Activity 2, we will develop a final report providing useful guidance on the value of investments in design and fabrication studios for organizations who make education investment decisions. The tangible outcomes of this study – such as the specific forms of the developed tools for assessing makerspaces – will be more fully realized as the