objectives for this paper was to document the details of what we did to implement the flipped classroom, including details such as software choices, video length, and topics used. Here are the things that we learned and wished that we had known when we started this. 1. Do not be afraid to try new things. When Prof. DeNucci first brought this idea to Prof. Swithenbank, she was not excited about it. It was new and different, but after further reflection, she thought, “why not give it a try?” This may work for you and it may not, but it was definitely worth trying. We would use this method again, incorporating some of these lessons learned. 2. Preparation will reduce the amount of time it takes to produce the
two experienced Freeform instructors was video recorded over the course of the Spring 2016 semester and subsequently analyzed with respect to instructor actions. Continuous video coding analysis was used to capture how much time these two instructors dedicated to various instructional activities such as assessments, traditional lecturing, demonstrations, and writing notes or examples in real time. The analysis provides a clearer picture of how and when these two veteran instructors employed active, blended, and collaborative approaches in their classrooms. The implications of the analysis are two-fold. First, we strive to improve Freeform instruction at our institution by providing instructors with an opportunity to reflect on their
developed additional SMK activities, we have generally substituted them for the whiteboard problem solving in the overall mix of class activities, thus keeping the overall fraction of class time devoted to active learning approximately constant. Students prepare for each class session by completing an example calculation and reflective writing assignment based on assigned reading from two open educational resources (OERs) [16], [17]. To illustrate this approach we will next describe how the SMK1 activities outlined above fit into the first week of class sessions. The second class meeting begins with a series of ABCD questions assessing student comprehension of the reading reflection assignment on position vectors and Cartesian components. The question
a small subset of the resources provided (typically 2-4 resources) while overlooking the others, rather than consistently using all nine resources at their disposal. Four resources stood out as being most popular: peer collaboration, the lecturebook, online videos, and the course blog, which reflected the findings of Wirtz et al.34 at the departmental level within the context of the Freeform environment.
Examining relationships between resource usage and academic performance
Using the cluster analysis from their previous paper, Stites et al.18 examined how usage of the nine resources correlated with the students’ academic outcomes in the course (i.e., higher final grades and better exam performance). Combining survey data and academic
several subsystems on a team with four or five other people using a suite of tools is too great,” 21 and provides the following advice, “to become competent in each one of these areas—application and integration, team work, and tool use—students need time, repeated experiences, and a lot of reflection on the learning.” 21 The goal of this research project is to create active-learning activities that create meaningful connections between engineering science and engineering design and that teach students to apply and integrate when ‘doing design’. This goal is well summarized by Dym: there must be “a change in attitude toward a more explicit and visible role for design as being ‘what engineering is all about.’ Analysis unquestionably retains its centrality
hypothesizing, probing, and reflecting. Information is given to players/learners at just the time they will be able to make sense of it and to use it. In a videogame, knowledge is powerful because it can be put to productive use. I make no claim that Spumone measures up to the ideal playing/learning environment described above. However, it would be interesting to take a deeper look into how students are using Spumone, and to look for affordances provided by the game that are benefiting the learning process. The study described in this paper is more exploratory in nature, with a goal of finding discernible patterns of play and patterns of learning within the “click stream” captured by the game log files.
Videogame Challenge: Spumone Drop
Spumone contains more
impact of faculty-mentored learning versus online learning conducted with freshmen at MIT [6]. At the graduate level, the Delta Design game has been used as a tool to teach graduate students reflective practice. Instead of using a real problem, instructors chose to use the Delta Design game because it is easier to control the amount of training each student receives, and it levels the playing field since no student has outside knowledge of the challenge. Additionally, the instructor can control the focus of the game such that if the students are having difficulty creating a viable structure, he or she can draw their focus back to reflective practice by changing the values of constraints to make the task easier [4].
Details of the Redesign
The Delta Design
of students
 Course              Semesters   Terms        Students by final grade   Total number of students
 Statics             3           FA19−FA20    105 / 105 / 50 / 8        268
 Dynamics            8           SP17−FA20    127 / 207 / 142 / 38      514
 Deformable Solids   8           SP17−FA20    154 / 240 / 155 / 47      596
The population data is broken down by the final grade received in the course to allow us to show how the course grade reflects mastery. Note that during a semester the students are given feedback based on how they are progressing in their mastery of each objective, never a letter grade. This mastery-grading approach is cumulative throughout an entire semester, which often takes students until
and their performance was about the same on each area. A high mismatch indicates that a student found some material more challenging than other material, and their performance on graded assignments reflects that. As a practical matter, the minimum value for the student mismatch score S_{m,i} is zero (the student performs exactly the same on each topic area) and the maximum mismatch could be as large as 800 or 900 (for a student whose performance is wildly erratic across topical areas). In this study, the minimum mismatch score was 52.5, the mean was 248, and the maximum was over 700. The class average mismatch S_{m,class}, calculated via equation (1) using class averages on each topic area in the j and k summations, was about 130, corresponding to just less than ½ letter
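Equation (1) itself is not reproduced in this excerpt. Purely as an illustration, a mismatch score with the qualitative properties described here (zero for uniform performance, large for erratic performance) could be sketched as a sum of absolute pairwise differences between topic-area scores; the function name and the pairwise form are assumptions, not the paper's actual formula.

```python
from itertools import combinations

def mismatch_score(topic_scores):
    """Hypothetical mismatch score: the sum of absolute differences
    between every pair of a student's topic-area scores (0-100 scale).
    Illustrative only -- not the paper's equation (1)."""
    return sum(abs(a - b) for a, b in combinations(topic_scores, 2))

# Uniform performance across topics yields zero mismatch:
print(mismatch_score([80, 80, 80, 80]))   # -> 0
# Wildly erratic performance across topics drives the score up:
print(mismatch_score([100, 0, 100, 0]))   # -> 400
```

Any score of this pairwise form is bounded below by zero and grows with topic-to-topic variability, matching the behavior the excerpt attributes to equation (1).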
that seeks to promote diversity and improve transdisciplinary collaboration within the college. Specifically, I serve on the Resilience in Engineering Education Project team, which aims to investigate the effects that students’ resilience and professional skills have on exam performance in technical courses.
Dr. Nicola W. Sochacka, University of Georgia
Nicola W. Sochacka is the Associate Director of the Engineering Education Transformations Institute (EETI) in the College of Engineering at the University of Georgia. Dr. Sochacka’s research interests span interpretive research methods, STEAM (STEM + Art) education, empathy, diversity, and reflection. She holds a Ph.D. in Engineering Epistemologies and a Bachelor of
them would be very reflective of the problem they were asked to solve. Others worked the problem on paper and consulted the video only when unsure about a step, or sometimes to confirm that their approach was correct.
Observation 2: High-achieving students watched the video less during the experiment
Figure 3 shows fixation time and dynamics course grade as a function of performance on the problem completed during the laboratory experiment. There is a visible cluster of students who performed well in the course, performed well on the experimental problem, and had low fixation time. This observation is consistent with the notion that high-achieving students need fewer instructional supports than other students—this is why they are high achieving. Even
’ qualitative understanding of basic concepts and principles. CIs typically consist of multiple-choice questions with one correct answer and several “distractors” that reflect common misconceptions. The misconceptions are usually identified through formal research processes, such as using focus groups in which students answer questions and explain their reasoning in an expository manner. A CI can be used to assess both individual student learning gains and the effectiveness of pedagogical strategies, particularly by measuring differences in performance via pre-test (before instruction) and post-test (after instruction). If the CI is not appropriate as a pre-test, then its ability to measure learning gains might be established via other correlations, such as with
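One widely used way to quantify such pre/post differences (not named in this excerpt, so an assumption on my part) is Hake's normalized gain, the fraction of the possible improvement a student actually realizes:

```python
def normalized_gain(pre, post, max_score=100):
    """Hake's normalized gain <g> = (post - pre) / (max - pre):
    the share of the available headroom gained between
    pre-test and post-test on a concept inventory."""
    return (post - pre) / (max_score - pre)

# A student moving from 40% to 70% realizes half of the possible gain:
print(normalized_gain(pre=40, post=70))   # -> 0.5
```

Because the gain is normalized by each student's headroom, it allows comparison of pedagogical strategies across groups with different pre-test scores.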
Concept Inventory20. Additionally, the moderate correlation coefficients between the inventory scores and exam scores fall in the range of values found in previous publications comparing concept scores to problem-solving skills16. This fits with the observation that much of the final grade and the exam scores reflect assessments of problem solving rather than conceptual understanding. Overall, the expert selection of questions for the 11-question subset and the significant correlations between the aDCI scores and other assessment metrics provide evidence that the aDCI is sufficiently valid for use in this study. Table 2. Spearman correlation coefficient (ρ) for aDCI scores and other performance metrics. aDCI Pre-Test
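Coefficients like those in Table 2 can be computed from raw score pairs. As a sketch (pure Python here; in practice one would call scipy.stats.spearmanr), Spearman's ρ is simply the Pearson correlation of the ranked data:

```python
def average_ranks(xs):
    """1-based ranks of xs, with ties receiving the average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Perfectly monotone score pairs give rho = +1:
print(spearman_rho([55, 70, 90], [60, 75, 95]))   # -> 1.0
```

Ranking before correlating is what makes ρ appropriate for assessment scores, which are ordinal and often non-normally distributed.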
reflecting on her own experiences as an undergraduate and her preference for active learning techniques. She also notes that she would like to do more but has not had any formal training: “Ultimately, I do the best I can but feel that I don’t have a lot of formal training. I’d like to get it, but haven’t found the time, or taken the time, to do it … I have taken a lot of what I observed as a student and focused on things that I liked and didn’t like. I have aspirations of using more research to help develop my teaching in the future.” The faculty member who scored the highest on the RTOP also had the most formal training in effectively
partial credit defined in the rubric. More details about the rubric and the grading scheme are described in [8,9]. Locating, classifying and correcting errors on exams can be a very important part of the learning process. This is referred to as reflection by cognitive scientists [2], and we prefer that students rather than graders glean this benefit. We hope that this process leads to higher accuracy and grades in the future, all while developing an engineering mindset for checking work and locating mistakes.
Early and Frequent Assessment. In this new course design the timing and frequency of assessment is important. It is recommended that students get two or three early assessments during the first five weeks of the semester. If the assessments are left
to ensure high levels of student learning, engagement, and overall satisfaction. It is noted, nonetheless, that the post-survey via student feedback is subjective, and might not reflect the extent to which students learned. The responses to question six in the post-survey, however, indicate that the lab’s experiments and analyses related to the strength of materials course, but they do not reveal specific learning outcomes. Future research will incorporate both control and test groups in order to initiate comparison analyses and reveal specific learning outcomes.
REFERENCES
[1] Amadieu, F., Mariné, C., & Laimay, C. (2011). The attention-guiding effect and cognitive load in the comprehension of animations. Computers in Human Behavior, 27(1), 36-40.
[2
5. Correlations among the self-efficacy scale scores are generally moderate to high, ranging from .4 to .6, which corresponds to 15 to 35 percent shared variance. A factor analysis was conducted across administrations and scales (including Week 1) to determine the factor structure (Tables 2 & 3); the solution was restricted to two factors. This method of data reduction is used to seek underlying unobservable (latent) variables that are reflected in the observed administrations. As seen in Table 2, the two factors selected had eigenvalues of 1.00 or greater. This approach is the default in most statistical programs, such as SPSS (the program used in the analysis of our study), where eigenvalues are used to condense the variance in a correlation
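The eigenvalue-of-1.00-or-greater retention rule described here (the Kaiser criterion) can be sketched as follows; the toy correlation matrix is my own example, not data from the study:

```python
import numpy as np

def kaiser_factor_count(corr):
    """Number of factors retained under the Kaiser criterion:
    count the eigenvalues of the correlation matrix that are >= 1.0,
    the default retention rule in packages such as SPSS."""
    eigvals = np.linalg.eigvalsh(np.asarray(corr, dtype=float))
    return int(np.sum(eigvals >= 1.0))

# Toy 3-variable correlation matrix dominated by one shared factor;
# eigenvalues are roughly 2.0, 0.6, and 0.4, so one factor is kept:
R = [[1.0, 0.6, 0.5],
     [0.6, 1.0, 0.4],
     [0.5, 0.4, 1.0]]
print(kaiser_factor_count(R))   # -> 1
```

The rationale is that each standardized variable contributes unit variance, so a factor with an eigenvalue below 1.0 explains less variance than a single observed variable would.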
13. The perception of the instructor by the students suffered every semester of the study, due to the enforcement of the penalties on late homework (Table 1). Penalties on late homework caused the students to perceive the instructor as not being very helpful to them. These feelings were reflected in the evaluations of the course by the students. The course-evaluation instrument that was used consisted of many line items, one of which asked the students to score the extent to which they perceived the instructor as being helpful. In semester 1, the score given was 3.78/4; in semester 2, that score
problems in engineering mechanics. This paper outlines key findings from this Work-In-Progress study and makes recommendations for future work in this area.
Acknowledgement
This material is based upon work supported by the National Science Foundation in the U.S. under grant numbers DRL-1535307 (PI: Perez) and DRL-1818758 (PI: Sorby). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
References
[1] D. H. Uttal and C. A. Cohen, “Spatial thinking and STEM education: When, why, and how?,” Psychology of Learning and Motivation, 57, 147–181, 2012. https://doi.org/10.1016/B978-0-12-394293-7.00004-2
[2] L. G
mechanics courses are likely to proliferate in the coming years, as the abrupt shift to online learning amidst the COVID-19 pandemic has prompted many students, faculty, departments, and institutions to revisit beliefs and assumptions about online courses. The authors believe in the potential of hands-on models to support student learning in mechanics and hope this paper will provide an opportunity to learn from our experiences and adapt other hands-on approaches for online implementation.
Acknowledgement
This material is based upon work supported by the National Science Foundation under grant numbers DUE #1834425 and DUE #1834417. Any opinions, findings, and conclusions or recommendations expressed are those of the authors and do not necessarily reflect
predictive effect on team assignment performance. Finally, the transition to remote learning (in the face of the COVID-19 pandemic) had a negative effect on student performance, and this negative consequence disproportionately affected students who were already poor performers.
Introduction
The ability to work in teams has long been recognized as a critical skill for all engineering graduates, as reflected in accreditation criteria specified by ABET [1]. Criterion 3, student outcome number 5, states that students must have: “An ability to function effectively on a team whose members together provide leadership, create a collaborative and inclusive environment, establish goals, plan tasks, and meet objectives.” As a result, there have been
mechanics courses. It is our hope that the use of the CW will make it easier for faculty members to implement the DCI in their courses, and for us to collect data on the instrument so we can improve it in the future.
Disclaimer: The views expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the United States Air Force Academy, the Air Force, the Department of Defense, or the U.S. Government. Distribution A. Approved for public release, USAFA-DF-2020-27: distribution unlimited.
References
1. Gray, G.L., D. Evans, P. Cornwell, F. Costanzo, B. Self, “Toward a Nationwide Dynamics Concept Inventory Assessment Test,” Proceedings of the 2003 ASEE Annual Conference, Nashville, TN, June 2003.
2
work through problems, and when they should rely on calculations to help adjust their intuition. This exercise has certainly provided a moment of self-reflection for the authors and a direction towards improvement of their courses.
Bibliography
[1] D. Hestenes, M. Wells, and G. Swackhamer, “Force concept inventory,” Phys. Teach., vol. 30, no. 3, pp. 141–158, 1992.
[2] C. Henderson, “Common Concerns About the Force Concept Inventory,” Phys. Teach., vol. 40, no. 9, pp. 542–547, 2002.
[3] J. Docktor and K. Heller, “Gender Differences in Both Force Concept Inventory and Introductory Physics Performance,” Am. Inst. Phys
with. A group of faculty in biochemistry at NC State has been working on a WordPress platform where lesson plans can be paired with 3D-printable designs for students to access. We propose to build a comparable site for Engineering Mechanics. Each lesson will include:
 • written explanations of the topic
 • a 3D CAD file using Fusion 360 where students can access the file, see how it's built, and edit it as needed
 • a brief video from Fusion 360 where the part spins or deforms
 • a 3D printer file so that faculty or students with access to a 3D printer can print their own demonstrations
 • a lesson plan describing a simple experiment to demonstrate the topic being discussed
 • reflection questions built around the
five topics: free body diagrams, equilibrium, equivalence, separation of rigid bodies, and friction. In this approach, students use a consistent method to draw free body diagrams, develop equilibrium equations, and solve the equations for unknowns. Conceptual warm-up exercises are used to assess student misconceptions in each topic and enhance their learning. Gardner and Jacobs19 developed a structural experience for students that helps them make the abstract theoretical concepts that they learn in early stages more robust. Embedded in this experience were strategies that reflected both ‘good teaching’ practice and relevant management strategies. The authors have developed a case study with accompanying worksheets that became the scenario for rich
“close” from the start. In our old sequence students were taught basic statics and shear and moment diagrams together. Repetition has been one of the keys to our success, so now that students have one complete term to grasp the concepts of centroids, reactions, and internal pins, they are better prepared and have a better chance at drawing correct shear and moment diagrams. The program is in its second year of implementation. So far, the results have been positive based on student assessment/questionnaires and student passing rates. The students have identified the activity sessions and model-making exercises as key points in understanding the material presented in class. Student performance reflects this sentiment - failure rates have
paper authors will present the impact of utilizing the “adaptive follow-up” modules in Pearson Mastering Engineering, as well as a reflection on the different methods used over the study period. As in previous years, assessment of the efficacy of homework assignments will be based on observation of students’ performance on exams, and a survey of students’ perceptions relative to historical norms. Institutional review of research protocol determined that full board review of the study and informed consent was not required.
Introduction
Over the past 3 years the authors have been collecting and reporting data on homework, quiz, and exam performance, as well as survey data on students’ perceptions of learning and opinions on the methods used in the course
work mathematically and assume the slender rod rotates about O with a rotational speed of 0.5 rad/s.
Constructing an assessment rubric for student performance
Based on Wood's problem-solving methodology ([2], [9]), data is collected from the students' responses to the open-ended homework problems on six of the seven steps – engage, define, explore, plan, implement, check, and reflect. Data on student engagement is collected from the responses the students gave to a questionnaire. For brevity, the rubrics for step 1 (student engagement) and step 4 (planning) are shown in Tables 1 and 2 in Appendix 1. The data was collected for each of the twelve open-ended homework questions and averaged at the end of the semester. In addition, a second questionnaire is
using multiple exams to minimize adverse effects to their GPA. However, having multiple exams meant that students took about one exam each week. By the end of the semester this created some fatigue and related stress in the student population. Additionally, many students were still trying to rely on memorization rather than follow a process based on the Compass, so their stress grew as their grades reflected that memorization would not work.
Forced practice and spaced repetition. Though it was not the primary intent, we realized midway through the semester that the weekly exams had become an intense forced practice session. During exams, students would sit uninterrupted for 90 minutes once a week to work on course problems. The repetition and spacing of
presenting the material in a way that is easily remembered by students. With this in mind, the ABCD mnemonic device was developed. In class, it is presented in bullet format as shown:
 • Ⓐ – All Forces
 • Ⓑ – Body
 • Ⓒ – Coordinates
 • Ⓓ – Dimensions (Only for rigid bodies)
 ⇒ Ⓔ of Ⓔ – Equations of Equilibrium
(Stated – Your A, B, C and sometimes D drives your E of E.)
While the order of the ABCD does not reflect the order generally followed when physically drawing the FBD, it does serve as a reminder to check that everything is included. Students are encouraged to always write the letters “ABCD” on their homework and exam papers. In class, the process for drawing an FBD is outlined as follows. Initially students must identify an appropriate