in their capabilities of using CAD software. Therefore, there is currently a lack of research investigating how students develop self-efficacy in relation to CAD prior to their undergraduate degree.

As there is currently no validated scale to measure CAD self-efficacy, in this paper we explore the related concepts of undergraduate engineering students' initial 3D Modeling and Engineering Design self-efficacy before formal CAD instruction at the university level. Bandura's Theory of Self-Efficacy suggests there are four main sources of self-efficacy: mastery experiences, social persuasion, vicarious experiences, and physiological states [1]. Therefore, we aim to answer the question: "What prior CAD learning experiences influence
: https://sites.psu.edu/learningfactory/students/edsgn-100-cornerstone/
[28] D. Baker, S. Krause, and S. Purzer, "Developing an instrument to measure tinkering and technical self-efficacy in engineering," presented at the 2008 Annual Conference & Exposition, 2008, pp. 13–392.
[29] E. Anderson, "The white space," Sociology of Race and Ethnicity, vol. 1, no. 1, pp. 10–21, 2015.
submitted by six students with non-Nursing projects and 15 students with Nursing projects. However, the pre-Empathy survey results in Table 3 do demonstrate that Engineering first-year students, regardless of the assigned project, are empathic. Hess et al. constructed their Empathy survey with a 9-point Likert scale. At week 14 of the design project, the average item response for the Interpersonal Self-Efficacy, Empathetic, and Perspective-Taking subscales ranged from 7 to 9. In contrast, when the Empathy Survey was deployed in an introductory biomechanics course at another institution, the average item response for these subscales ranged from 6 to 7 [41]. Future research could conduct measurement invariance tests to examine directly whether the magnitude
academic climate, grades and conceptual understanding, self-efficacy and self-confidence, high school preparation, interest and career goals, and race and gender." [5]

There have been repeated calls to reimagine engineering education to better prepare students for the 21st century (e.g., [6]). Institutions across the country have redesigned their introductory course experiences in engineering in recent years. These include, for instance, Oregon State University [7], James Madison University [8], Norwich University [9], Portland State University [10], Temple University [11], Clarkson University [12], and the University of California, Irvine [13], among others. However, this is not a US-specific phenomenon, with institutions in other countries reporting similar
identity: the impact of practice-oriented learning experiences," International Journal of STEM Education, vol. 10, no. 48, 2023, https://doi.org/10.1186/s40594-023-00439-2
[4] N. Mamaril, E. Usher, C. Li, D. Economy, and M. Kennedy, "Measuring undergraduate students' engineering self-efficacy: A validation study," Journal of Engineering Education, vol. 105, no. 2, pp. 366–395, Apr. 2016, doi: 10.1002/jee.20121.
[5] G. Zhang, T. J. Anderson, M. W. Ohland, R. Carter, and B. R. Thorndyke, "Identifying factors influencing engineering student graduation: A longitudinal and cross-institutional study," Journal of Engineering Education, vol. 93, no. 4, pp. 313–320, Oct. 2004
Data Science and Analytics, Feb. 2024, doi: 10.1007/s41060-024-00509-w.
[19] R. H. Kilmann and K. W. Thomas, "Developing a forced-choice measure of conflict-handling behavior: The 'MODE' instrument," Educational and Psychological Measurement, vol. 37, no. 2, pp. 309–325, 1977.
[20] A. C. Graesser, P. W. Foltz, Y. Rosen, D. W. Shaffer, C. Forsyth, and M.-L. Germany, "Challenges of assessing collaborative problem solving," in Assessment and Teaching of 21st Century Skills: Research and Applications, pp. 75–91, 2018.
[21] D. A. Kolb, Experiential Learning: Experience as the Source of Learning and Development. FT Press, 2014.
[22] A. Bandura, "Self-efficacy: Toward a unifying theory of behavioral change," Psychological Review, vol
." AMCIS 2004 Proceedings. 397.
[4] Milligan, S. K., and Griffin, P., 2016, "Understanding Learning and Learning Design in MOOCs: A Measurement-Based Interpretation," Journal of Learning Analytics, 3(2), pp. 88–115.
[5] Jonassen, D. H., 1995, "Operationalizing Mental Models: Strategies for Assessing Mental Models to Support Meaningful Learning and Design-Supportive Learning Environments," CSCL '95 Proceedings, pp. 182–186.
[6] Bucciarelli, M., 2007, "How the Construction of Mental Models Improves Learning," Mind and Society, pp. 67–89.
[7] Ramalingam, V., Labelle, D., and Wiedenbeck, S., 2004, "Self-Efficacy and Mental Models in Learning to Program," SIGCSE Bull., 36(3), pp. 171–175.
[8] Hwang, G. J., Shi, Y. R., and Chu
Education in the 21st Century, 78(1), 2020, pp. 61–79.
[13] J. Hurley, "Rubrics and the Dehumanization of Education," Medium, Aug. 12, 2020. https://profhurley.medium.com/rubrics-and-the-dehumanization-of-education-19f1907860e6.
[14] K. Polston, "Students' Perceptions and Attitudes towards Rubric Assessment of Creativity," International Textile and Apparel Association Annual Conference Proceedings, vol. 73, no. 1, Iowa State University Digital Press, 2016.
[15] E. Panadero and M. Romero, "To rubric or not to rubric? The effects of self-assessment on self-regulation, performance and self-efficacy," Assessment in Education: Principles, Policy & Practice, 21(2), pp. 133–148, 2014, doi: 10.1080/0969594X.2013.877872.
[16] E. Panadero and A
2023, shortly after they completed their respective interventions.

Surveys

To understand the interventions' impact on sense of belonging and engineering identity, program participants responded to a retrospective pre- and post-questionnaire that combined two validated survey instruments: Godwin's [9] engineering identity scale and Hanauer et al.'s [11] measure of persistence in the sciences (PITS). The PITS combines five other validated instruments that measure project ownership-emotion, project ownership-content, science identity, self-efficacy, scientific community values, and networking on a five-factor scale. These variables have been shown to predict psychological factors that influence students' intent to stay in science and engineering
program for the social, behavioral, and biomedical sciences," Behavior Research Methods, vol. 39, no. 2, pp. 175–191, May 2007, doi: 10.3758/BF03193146.
[38] M. Hainselin, A. Aubry, and B. Bourdin, "Improving Teenagers' Divergent Thinking With Improvisational Theater," Front. Psychol., vol. 9, p. 1759, Sep. 2018, doi: 10.3389/fpsyg.2018.01759.
[39] J. A. Mourey, "Improv Comedy and Modern Marketing Education: Exploring Consequences for Divergent Thinking, Self-Efficacy, and Collaboration," Journal of Marketing Education, vol. 42, no. 2, pp. 134–148, Aug. 2020, doi: 10.1177/0273475318822087.
[40] P. Felsman, S. Gunawardena, and C. M. Seifert, "Improv experience promotes divergent thinking, uncertainty tolerance, and affective well
Paper ID #42725

Board 68: Integration of Learning by Evaluating (LbE) within the 5E Instructional Model in Engineering-Design Education

Dr. Wonki Lee, Purdue University

Wonki Lee received a Ph.D. in Education, Curriculum Instruction, Language and Literacy at Purdue University. She received her bachelor's and master's degrees, specializing in Korean language education as a second/foreign language, from Seoul National University, South Korea. Her research interests are self-efficacy, culturally responsive teaching, and machine learning in diverse educational settings.

Prof. Nathan Mentzer, Purdue University

Nathan Mentzer is a
, while later modules build in complexity to focus on integrating these newfound skills and knowledge. Within each week's module, learning also builds toward articulated learning goals made known to learners via a Canvas Overview and Wrap-up, agendas during in-class activities, and (light) assignment rubrics. The repeated weekly structure creates a familiar tempo that fosters both learner and student-teacher self-efficacy, guiding learners while they build up their engineering project portfolios. We provide examples of the Canvas Learning Management System artifacts in the figures below.

Figure 1: Canvas depiction of the full course module structure of two First Year Design offerings, as designed by student-teachers: Intro to Cybersecurity (Left
developed by the research team to assess the effect of the course on students' self-efficacy as well as their interests in STEM, design, and robotics, while the university-administered evaluation is the standardized course evaluation that is conducted for all courses across campus. The objective of the university-administered evaluation is to gather feedback from students regarding their learning experiences, the effectiveness of the instructor, and the overall quality of the course. The evaluation serves as a valuable tool for the instructor and administrators to assess teaching methods, identify areas for improvement, and make informed decisions about curriculum development and faculty performance. The anonymous university-administered course evaluation was
suspect. Eliminating them from consideration does not alter the general findings. Finally, effect sizes were calculated (r values in Tables A3 to A8). These "measure…the closeness of association of the points in a scatter plot to a linear regression line" [27] and are associated with a scale categorizing the closeness of association (e.g., no association, very weak, weak, etc.) [27, 28]. While findings are discussed using p values, a common practice in the presentation of pre- and post-instruction measures of educational interventions, it is the r values that were used to interpret the patterns and arrive at the study's conclusions.

Persistence and graduation rates of native students and those who transferred to the institution who had completed one of the