information through a series of courses taken by undergraduate students also needs to be studied. These issues are addressed in ongoing studies which will be reported later. Further, the scalability of this approach will also be studied in other engineering schools in the future. Although this study focuses on the tools, course content, elements of structure, and process of learning, it does not specifically address the role and influence of faculty on the learning environment.

Acknowledgements: Support for this work is provided by the National Science Foundation Award No. DUE 1504692 and 1504696. Any opinions, findings, and conclusions or recommendations expressed in this paper are those of the authors and do not necessarily reflect the views of the
of learning and engagement in a makerspace environment. Our analysis revealed the instrument had acceptable levels of reliability (above .65 and below .95), which we maintain makes the instrument suitable for assessing student perceptions and engagement in makerspaces. Further, the acceptable reliability indicates students are answering the items consistently, which further reflects alignment of the items to our four constructs of interest. We were able to provide an additional level of assurance that our instrument is aligned with our assessment goals through our analysis of the students' responses in conjunction with their individual characteristics. The only association with student individual characteristics we found to be predictive of the survey
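As a point of reference only, the kind of internal-consistency check described above can be computed with Cronbach's alpha. The sketch below is a minimal, hypothetical illustration: the construct names, item columns, and simulated Likert responses are placeholders and are not the instrument's actual items or data.

```python
# Minimal sketch (not the authors' analysis): Cronbach's alpha per construct,
# computed on simulated 5-point Likert responses with placeholder item names.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of item columns (rows = respondents)."""
    k = items.shape[1]                           # number of items in the scale
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Simulated responses for two hypothetical constructs (three items each).
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(200, 1))
responses = pd.DataFrame(
    np.clip(base + rng.integers(-1, 2, size=(200, 6)), 1, 5),
    columns=["eng1", "eng2", "eng3", "bel1", "bel2", "bel3"],
)
for name, cols in {"engagement": ["eng1", "eng2", "eng3"],
                   "belonging": ["bel1", "bel2", "bel3"]}.items():
    alpha = cronbach_alpha(responses[cols])
    print(f"{name}: alpha = {alpha:.2f} (target: above .65 and below .95)")
```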
capstone design curriculum, the current two-year research project was designed to implement and assess the efficacy of the activities as an integral part of the course. IC activities have been incorporated in the USAFA capstone design course previously, but their effects were not directly studied. Nevertheless, faculty observations and customer feedback suggested that creativity and product innovation improvements occurred. Thus, sufficient anecdotal evidence existed to motivate further formal examination of the impact of IC activities on the USAFA engineering design process and capstone design course. Since the underlying conceptual process of the capstone design course and the DI activity experience reflects the divergent thinking processes, it is appropriate
. Kimball, and R. D. Reason, "Understanding Interdisciplinarity: Curricular and Organizational Features of Undergraduate Interdisciplinary Programs," Innov. High. Educ., vol. 38, no. 2, pp. 143–158, 2013.
[8] B. A. Masi, A. E. Hosoi, and S. A. Go, "Re-Engineering Engineering Education: A Comparison of Student Motivation, Ability Development and Career Paths in Traditional and Cross-Disciplinary Engineering Degree Programs," Am. Soc. Eng. Educ., 2011.
[9] J. Berglund, "The Real World: BME graduates reflect on whether universities are providing adequate preparation for a career in industry," IEEE Pulse, vol. 6, no. 2, pp. 46–49, Mar.–Apr. 2015.
[10] R. H. Harrison, J.-P. St-Pierre, and M. M. Stevens, "Tissue
evaluate these results in the context of a larger and a more longitudinal study. Nevertheless, the results presented here offer strong support for including more engineering challenges that embrace social responsibility in the undergraduate engineering curriculum.

Acknowledgments

The authors would like to gratefully acknowledge the National Science Foundation for their support of this work under the TUES program (grant number DUE-1245464). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

References
[1] National Academy of Engineering, "Grand Challenges - 14 Grand Challenges for Engineering," 03-Feb-2019. [Online
Engineering Education.
21. Huff, J. L., Smith, J. A., Jesiek, B. K., Zoltowski, C. B., Graziano, W. G., & Oakes, W. C. (2014). From methods to methodology: Reflection on keeping the philosophical commitments of interpretative phenomenological analysis. 2014 IEEE Frontiers in Education Conference (FIE) Proceedings.
22. Smith, J. A., Flowers, P., & Larkin, M. (2009). Interpretative Phenomenological Analysis: Theory, Research, Practice. London: Sage.
23. Godwin, A., Potvin, G., Hazari, Z., & Lock, R. (2013). Understanding engineering identity through structural equation modeling. 2013 IEEE Frontiers in Education Conference (FIE).
24. Hazari, Z., Sonnert, G., Sadler, P. M., & Shanahan, M. (2010). Connecting high school physics
provided strong evidence of validity for the EPRA tool from some of the interview cases examined.

Acknowledgements

This material is based on work supported by the National Science Foundation under Grant #1158863. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Bibliography
1 ABET, "Criteria for Accrediting Engineering Programs Effective for Evaluation During the 2015-2016 Accreditation Cycle," ABET Engineering Accreditation Commission, Baltimore, MD, 2014.
2 L. J. Shuman, M. Besterfield-Sacre and J. McGourty, "The ABET "Professional Skills" - Can They Be Taught? Can They Be Assessed?," Journal of
) between students' use of representations in each stage and the respective scores. One exception was the configuration step, where the score was well correlated with the number of representations used. The correlation resulted in a value of .52 (p-value = .002) for the configuration step. To evaluate the effects of each type of representation on this score, we performed a multiple linear regression. Equation (1) describes the model used to predict students' score on the configuration step (SC) based on the numbers of images, plots, tables, equations, calculations, and charts. Results reveal a significant effect of the use of equations on this stage (p-value < 0.016). No other type of representation had a significant effect. This fact could reflect the
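For illustration only, a regression of the kind described above could be fit as follows. This is a hedged sketch, not the authors' Equation (1) or data: the simulated counts and coefficients are placeholders chosen only to show the structure of a model predicting the configuration-step score (SC) from counts of each representation type.

```python
# Illustrative sketch of a multiple linear regression like the one described:
# predicting the configuration-step score (SC) from representation counts.
# The data below are simulated placeholders, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 35
df = pd.DataFrame({
    "images":       rng.poisson(2, n),
    "plots":        rng.poisson(1, n),
    "tables":       rng.poisson(1, n),
    "equations":    rng.poisson(3, n),
    "calculations": rng.poisson(2, n),
    "charts":       rng.poisson(1, n),
})
# Simulated score in which only equation use carries signal.
df["SC"] = 2 + 1.5 * df["equations"] + rng.normal(0, 1, n)

model = smf.ols("SC ~ images + plots + tables + equations + calculations + charts",
                data=df).fit()
print(model.summary())  # inspect the coefficient and p-value for each representation type
```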
, problem solving, and student engagement during class using a structured behavioral observation protocol known as the Teaching Dimensions Observation Protocol (TDOP). Several of the traditionally taught class sessions were also observed for comparison, with positive results noted. Also, a comparison of students' conceptual and exam performance in the two flipped sections versus the "traditional" section enabled direct assessment of the benefits of the new approach, with no significant differences detected. Further assessment of the flipped "pilot" classroom included student engagement, instructors' reflections, and two perception instruments measuring students' overall experience in the class.

1. Introduction and Literature Review

Numerous
consistency between the two authors, the Dedoose Training feature was utilized by having both researchers take the code application test. Each researcher set up a test, and both researchers took each other's test to evaluate how consistently the coding had been applied. Multiple tests were set up and completed after each iteration as the researchers deliberated, to ensure all the coding was consistent. The test yields a Pooled Kappa that reflects the agreement between both researchers. The final two tests gave a Pooled Kappa of 0.67, which falls in the range of good agreement between both researchers. The relationship between the Pooled Kappa and Cohen's kappa (a measure to evaluate inter-rater agreement) is that the Pooled Kappa is a global
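As a hypothetical illustration of the underlying agreement statistic (not the authors' Dedoose workflow or data), Cohen's kappa for two coders' labels on the same excerpts can be computed as follows; the Pooled Kappa described above is a global version of this statistic computed across all codes. The labels below are invented.

```python
# Hedged sketch: Cohen's kappa for two coders labeling the same excerpts.
# The code labels and assignments are made-up placeholders.
from sklearn.metrics import cohen_kappa_score

coder_a = ["theme1", "theme2", "theme1", "theme3", "theme2", "theme1", "theme3", "theme2"]
coder_b = ["theme1", "theme2", "theme2", "theme3", "theme2", "theme1", "theme3", "theme1"]

kappa = cohen_kappa_score(coder_a, coder_b)
# Values around 0.61-0.80 are commonly interpreted as substantial/good agreement.
print(f"Cohen's kappa = {kappa:.2f}")
```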
different ways. For the most part, however, there was a common factor that many students identified through the interview. If the student received a "not mastered" mark, they would almost always redo the problem, regardless of overall performance level. Some exceptions to this occurred. First, students with poor attendance records and poor records of turning in the original assignment would also sometimes skip turning in resubmissions. This is not viewed as a direct result of the mastery grading system, but rather a reflection of generally poor participation by a small percentage of students across any system. Second, some students indicated that if multiple resubmits piled up and important coursework from other classes also became time-consuming, they would
engineering. These studies could add to the limited body of research in engineering education regarding intentional conceptual change as well as interventions to promote conceptual change for other content.

Conclusions

This research study demonstrated that undergraduate engineering students have misconceptions about the emergent characteristics of drift. Through a written protocol and subsequent analysis, specific misconceptions of emergence for drift were identified and were found to be prevalent. Although some misconceptions observed here reflected those that have been reported in the literature about emergence, relationships between the different misconceptions were also observed. Misconceptions for the emergent characteristics of the phenomena were
Theory-Based Approach to Reflective Planning and Instruction, Faculty of Education, University of Regina.

Appendix A: Student Survey

TOPIC: Course Name

Please rate the following questions based on the scale given below.
1 Strongly Disagree   2 Disagree   3 Neutral   4 Agree   5 Strongly Agree

1. The course was effective in helping me learn the material presented.    1 2 3 4 5
2. The course was effective in helping me to understand the material.      1 2 3 4 5
3. The course format
. According to Downing and Haladyna6, validity is the most important consideration in test evaluation and refers to the appropriateness, meaningfulness, and usefulness of the specific inferences made from test scores. Haynes et al.7 go on to warn us that data from an invalid instrument can "over-represent, omit, or under-represent some facets of the construct and reflect variables outside the construct domain". The use of unreliable and/or invalid instruments in engineering education could lead to the inaccurate measurement of student outcomes and perceptions, incorrect program and class assessments, as well as a general misrepresentation of the current state of engineering education. In this paper, we propose a structured methodology for the initial steps
to which respondents indicate their level of agreement on a four-point Likert scale, from strongly agree to strongly disagree. Participants respond to the 26 items for each of the three classroom strategies (formative feedback, real-world applications, and initiating student-to-student discussions), thus yielding 78 data points. VECTERS additionally contains questions to collect demographic information about the instructors as well as general information about the engineering course they are reflecting upon when responding to VECTERS. Instructor information includes gender, ethnicity, and years of experience. Course information includes items to indicate the course level (100 to 400), whether the course is required, and the
reliability of each survey tool. To establish content validity, the NSSE relies on a panel of experts and uses student self-report data.31 In terms of reliability, NSSE has a reported value of 0.70 or higher for deep learning, which includes higher-order, integrative, and reflective learning items.32 Reliability values close to or above 0.70 are generally considered acceptable in statistical analysis.33 In terms of response process validity, NSSE used cognitive interviews and focus groups to determine that the survey was valid for students of different races/ethnicities.34 ECAR has not published information on the validity or reliability of its questionnaires. Since the present study relied on a newly constructed assessment tool, a panel of experts was used to
positioned as one who generates new ideas and is described with verbs such as reflecting, integrating, and self-explaining. Actively engaged students are similar to constructively engaged students; however, they differ in that actively engaged students manipulate content material without generating new ideas or concepts. Activities carried out by these students are often described by verbs such as repeat, rehearse, and copy. Lastly, passively engaged students are instruction-oriented and receive information through listening, reading, and watching. Importantly, these modes of engagement are not rigid categories used to describe students. Individuals may demonstrate a range of engagement modes and behaviors throughout their learning, and may engage
] need to implement a rigorous system of evaluation of their pedagogical assessments through the use of a measurement model that makes such demands on the data. To that end, the implementation of Rasch measurement models will provide robust validation for the measures of student learning outcomes, which in turn can improve course curricula by accurately targeting domains and transferable skillsets critical to the development of this generation's chemical engineers.

Acknowledgements

This material is based upon work supported by the National Science Foundation under Grant No. DUE 1712186. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the
a different perspective of how a student's URM identity could affect their progress towards degree completion. Finally, it provides institutions with recommendations on how to improve their support for students towards doctoral degree completion.

Acknowledgements

This research was supported by the National Science Foundation under Award No. 1723314. Any opinions, findings and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References
[1] M. Sana, "Immigrants and natives in US science and engineering occupations, 1994–2006," Demography, vol. 47, no. 3, pp. 801–820, 2010.
[2] "Engage to Excel: Producing One
this material are those of the authors and do not necessarily reflect the views of the NSF. The authors acknowledge the larger research study team, including Amy Arnolds for her help with intercoder work. We also thank our study participants and partner school liaisons.

References
[1] C. Amelink and E. G. Creamer, "Gender differences in elements of the undergraduate experience that influence satisfaction with the engineering major and the intent to pursue engineering as a career," Journal of Engineering Education, vol. 99, no. 1, pp. 81-92, 2010.
[2] S. Sheppard, S. Gilmartin, H. L. Chen, K. Donaldson, G. Lichtenstein, O. Eris, M. Lande, and G. Toye, "Exploring the engineering student experience: Findings from the Academic Pathways
identified as Asian, Hispanic, or White. As stated previously, this may reflect the students' willingness to participate in engineering's culture, although at this time there is no conclusive evidence, and it presents a clear arena for future work.

Out-degree

Having established that the social structure was receptive to diverse interactions, we tested to see if a particular racial group was more socially active than their peers. The descriptive statistics (Table 3) suggest that out-degree behavior is highly volatile (large standard deviations and range), positively skewed, and extremely leptokurtic. KW testing (H(5) = 5.6179, p = .3452) indicates that out-degree values are not dependent on the students' racial/ethnic identification.

Table 3: Descriptive statistics for
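To make the reported test concrete, the sketch below shows how a Kruskal-Wallis test of out-degree across six racial/ethnic groups could be run (six groups give the 5 degrees of freedom in H(5)). The group sizes and out-degree counts are simulated placeholders, not the study's network data, so the statistic will not match the value reported above.

```python
# Hedged sketch (simulated data): Kruskal-Wallis test of whether out-degree
# differs across six hypothetical racial/ethnic groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Six hypothetical groups of students with skewed out-degree counts.
groups = [rng.poisson(lam=4, size=n) for n in (40, 25, 30, 20, 15, 10)]

h_stat, p_value = stats.kruskal(*groups)
print(f"H(5) = {h_stat:.4f}, p = {p_value:.4f}")
# A non-significant p-value (as in the paper) gives no evidence that the
# out-degree distributions differ across the groups.
```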
approaches with technical engineering skills. This requires an enhanced curriculum with a focus on student teamwork, a greater consideration of social context, improved communication with diverse constituents, and reflection on an ethical understanding of their decisions and solutions. Effective faculty members need to mirror these values and skills in their instruction and mentoring. Efforts have begun to reimagine the "engineering canon," which requires a shift from positioning engineering as a purely technical endeavor to framing it as socio-technical. We are developing a new General Engineering program that incorporates this perspective [30]. In addition, we are developing modules that emphasize the sociotechnical nature of engineering for traditional
recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

References
[1] T. G. Duncan and W. J. McKeachie, "The making of the Motivated Strategies for Learning Questionnaire," Educ. Psychol., vol. 40, no. 2, pp. 117–128, 2005, doi: 10.1207/s15326985ep4002_6.
[2] R. H. Liebert and L. W. Morris, "Cognitive and emotional components of test anxiety: A distinction and some initial data," Psychol. Rep., vol. 20, pp. 975–978, 1967.
[3] R. L. Matz et al., "Patterns of gendered performance differences in large introductory courses at five research universities," AERA Open, 2017, doi: 10.1177/2332858417743754.
[4] B. King, "Changing college majors: Does it
to discuss how their commitment to their interests also led them to mentor and serve others using their knowledge. This pattern, commonly employed by the participants, remarkably reflects the three tenets of Stewardship Theory, as if it were a template used to construct each personal statement. The guidelines provided for the GRFP orient applicants to demonstrate their aptitude for conserving, generating, and transforming knowledge. Stewardship Theory constitutes the implicit framework applicants are led to use in their bid to demonstrate their viability as graduate students.

Conservation

Each participant demonstrated how they grew to become stewards of their discipline through their learning and studies. They distinguished themselves from their peers
to other STEM departments to understand the generalizability of results beyond the discipline and institution studied, using this analysis approach as a guide. It is also important to begin to consider how equity considerations factor into graduate funding allocations, that is, which students are receiving which types of sequential funding and how that impacts persistence and completion for women and Students of Color.

Acknowledgements

This research was funded by the National Science Foundation through grants #1535462 and #1535226. Any opinions, findings, and conclusions in this article are the authors' and do not necessarily reflect the views of the National Science Foundation. We would like to thank our collaborators for their contributions to this
students' motivation, goals, and self-efficacy on performance," in Proceedings of the 2016 ACM Conference on International Computing Education Research, 2016, pp. 211–220.
[49] D. Heo, S. Anwar, and M. Menekse, "The relationship between engineering students' achievement goals, reflection behaviors, and learning outcomes," Int. J. Eng. Educ., vol. 34, no. 5, pp. 1634–1643, 2018.
Policy Analysis, 31(4), 441-462.
[6] Carter, D. F., Ro, H. K., Alcott, B., & Lattuca, L. R. (2016). Co-Curricular Connections: The Role of Undergraduate Research Experiences in Promoting Engineering Students' Communication, Teamwork, and Leadership Skills. Research in Higher Education, 57(3), 363-393.
[7] Cassady, J. C., & Johnson, R. E. (2002). Cognitive Test Anxiety and Academic Performance. Contemporary Educational Psychology, 27(2), 270-295.
[8] DeHaan, R. L. (2005). The Impending Revolution in Undergraduate Science Education. Journal of Science Education & Technology, 14(2), 253-269. doi: 10.1007/s10956-005-4425-3.
[9] Doel, S. (2009). Fostering Student Reflection During Engineering Internships. Asia-Pacific Journal of Cooperative Education, 10
peers; this raises deeper questions about the meaning of GPA as an absolute or relative indicator of success, and the policy of departments in adhering to one such definition. Still, from an outcome-oriented perspective, students leaving STEM and excelling in their non-STEM discipline would be expected to have an increase in GPA, though the challenge is to separate what proportion of that increase is due to differences in program match and what is due to differences in program rigor.

Causal Implications for Policy. Related to the above, the RISE serves as a diagnostic, but not necessarily a prescriptive tool for changes to improve student success. Departments and colleges would need to reflect on the true explanation for the observed results, or collect
less constrained problem does not always yield a higher solution diversity, and how in some cases the structure of the course itself can be used to motivate students' independent thinking in a design-based project. In future work we hope to analyze ways that the different pedagogical models influenced learning outcomes beyond solution diversity, such as group dynamics.

Acknowledgments

This material is based upon work supported by the National Science Foundation grant number A451001 SF9018. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. We would also like to thank the students, teaching assistants, professors, and