while the instructor and teaching assistants work around the room to help the student pairs whose progress is slowed by wiring or measurement errors. The students summarize their observations and data measurements in a report they complete each week after the laboratory session. We encourage them to reflect on what they learned by completing the laboratory and what they would improve if they had to do a similar design for a different goal. Grading of the laboratory work emphasized the preparation and preliminary report (50% of the laboratory grade) over the post-lab report (30%). Questions asked of the students at the end of each laboratory session to check their engagement counted for the remaining 20% of the laboratory grade.
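To make the weighting concrete, here is a minimal sketch of how the three graded components might combine into a laboratory grade. Only the 50/30/20 split comes from the text; the function name and the sample scores are hypothetical.

```python
# A minimal sketch of the grading scheme described above. Only the
# 50/30/20 weights come from the text; the function name and the
# sample scores are hypothetical.

def lab_grade(prep_prelim: float, post_lab: float, end_questions: float) -> float:
    """Combine the three graded lab components (each on a 0-100 scale)."""
    return 0.50 * prep_prelim + 0.30 * post_lab + 0.20 * end_questions

# Example: preparation carries the most weight, so a strong preliminary
# report can offset a weaker post-lab report.
print(round(lab_grade(prep_prelim=90, post_lab=70, end_questions=80), 1))  # 82.0
```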
the department. To help minimize the potential for academic integrity violations and to encourage students to reflect on their proposed solutions, they were asked to prepare a screencast and verbally explain how they solved the problem, in addition to submitting their written solutions. As discussed in our previous study [1], the changes we applied to the course had a promising effect on students' performance in this course and a positive effect on their final exam grades. In addition, in the mid-quarter and end-of-quarter surveys in spring 2018, students cited the benefits of offering the lecture content in video format, including the opportunity to review the material before and after class and having extra practice and discussion time in class. In
Oppression (2018) [2] have discussed the phenomenon. There are many reasons why an algorithm may be considered “biased.” Incomplete or faulty data is one reason. For example, in a study published in Nature Communications [3], researchers confirmed for the first time that two of the top genomic databases in wide use today by clinical geneticists reflect a measurable bias toward genetic data based on European ancestry over that of African ancestry. This deficit in African-ancestry genomic data was identified during an 18-month study conducted via the Consortium on Asthma among African-Ancestry Populations in the Americas (CAAPA). When the CAAPA data were compared with current clinical genomic databases, researchers found a clear preference in those databases for
All three factors reflect student competence with the course material, and the course grade thus provides a measure of overall student performance in the course. The decision to use course grades as a measure of student performance was followed by a decision to exclude failing (F) grades when making pre- and post-OER comparisons of student performance. F grades are commonly received by students facing personal problems; they appear semi-randomly in some terms, and including them can skew class GPA when the number of students is small. Figure 3 shows the grade distributions of the 35 pre-OER and 38 post-OER students: the x-axis lists each letter grade and the corresponding grade point. The figure shows that the post-OER group earned a higher
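The comparison described here can be sketched as a small computation: average the grade points of each cohort after dropping F grades. The grade-point mapping and the two cohorts below are illustrative placeholders, not the paper's actual data.

```python
# A minimal sketch of the pre/post-OER comparison described above: mean
# grade point with F grades excluded. The grade-point mapping and the
# two cohorts are illustrative placeholders, not the paper's actual data.

GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def mean_gpa_excluding_f(letter_grades):
    """Average grade points over all non-F grades in the list."""
    points = [GRADE_POINTS[g] for g in letter_grades if g != "F"]
    return sum(points) / len(points) if points else float("nan")

pre_oer = ["A", "B", "B", "C", "C", "F"]   # hypothetical pre-OER cohort
post_oer = ["A", "A", "B", "B", "C"]       # hypothetical post-OER cohort
print(mean_gpa_excluding_f(pre_oer))   # 2.8 (the F is dropped)
print(mean_gpa_excluding_f(post_oer))  # 3.2
```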
and Background

Despite decades of targeted effort and resources, women remain dramatically underrepresented in engineering fields (Yoder, 2012), and this underrepresentation can lead to a number of marginalizing experiences. Researchers have demonstrated the ways in which masculine norms and values are reflected in engineering practice and therefore code the discipline as male (Dryburgh, 1999; Secules, 2019). At the same time, technical/social dualisms map onto male/female binaries in ways that inform and support beliefs about what counts as engineering work and what is peripheral to the practice (Faulkner, 2000, 2007). These factors combine into what amounts to an unwelcoming or chilly climate for women in most engineering fields (Ambrose, Bridges
data. The incorporation of direct and indirect tools was necessary to better assess the development of the students' communication skills as well as group interpersonal skills [3], [4]. The direct assessment was used to evaluate measurable tasks such as meeting deadlines, establishing goals, and meeting objectives. At the same time, the indirect assessment was more suitable for assessing students' ability to work productively with others, their leadership skills, and their communication skills [6]. Finally, a set of rubrics was developed to describe the students' performance levels and summarize the assessment results. The rubrics were generated and organized to directly measure and reflect the students' mastery of each outcome using a variety of
flipped a transportation engineering course and used questionnaires and class video recordings to show that students had a positive view of the change. The more broadly defined blended learning method combines face-to-face interaction with online tools in a general sense. To better teach entrepreneurial skills to students, Sidhu et al. incorporated a mock startup company course that takes students from concept to a low-tech demo. By shifting focus away from time-consuming technical details, more teamwork, self-reflection, and inductive learning could be taught. In a very different approach, Weaver et al. used a series of case studies of existing startups to give students a more holistic view of what it takes to bring an innovation to market
reflecting on the Introduction to the Internet of Things course offered three times so far at SAVC, the authors consider the following lessons learned not only worth sharing with instructors of a similar course but also potentially beneficial to those teaching engineering courses in a non-English-speaking country.

• The topics covered must be carefully selected to fit the three-week intensive course format. The purpose of this course is to expose students to the key concepts of IoT and stimulate their interest in taking more advanced courses, rather than to help them gain a thorough understanding of the technical details.

• Student engagement and motivation to continue learning were substantially greater after the focus
future classes with that instructor, they would worry it might be reflected in their grade. (That said, the one instructor they had who made inappropriate comments during my time at this school was let go the next term after the comments were reported. The comments were passed on by a small group of students, but as more students heard that someone had been brave enough to report the instructor, more students came forward. The emergency had been defined, and visible action was taken.) I asked if they would go to the chair and, again, the answer was “NO”. They were again worried about being found out and said they would just grin and bear it. We are not able to be in the classroom for every class of every instructor, but we do have meetings with our
assist in developing, implementing, and running these forms/surveys for you.

Next Steps

Fall 2019 provided another opportunity to reflect on our program assessment process (see our assessment schedule in Table 3). Based on our experience during the 2018-2019 cycle, we decided to undertake the following adjustments in preparation for our Spring 2020 program assessment:

1. Instead of each individual faculty member selecting an ACM CS2013 learning outcome to use as a PI for the student outcome, the department agreed to vote on a set of accepted PIs for each student outcome. The intent is for the faculty to select a PI from this department-approved set. The Assessment Committee will coordinate this PI selection process to ensure sufficient
limitations impact the findings of our work. First, as has been mentioned, at this stage in the research we have examined only a subset of our total dataset. As we describe in the future work section, this work will inform further analyses.

3 Results

In the preliminary analysis presented in this paper, 30 student survey responses were analyzed, and a total of seven content features, one layout feature, and two benefits were identified in the survey results. The results presented in this paper categorize the data gathered according to the coding framework presented in Section 2.3. In addition, the conclusions drawn are based on this preliminary analysis and as such may not reflect students' complete sentiments about support sheets. Future work will incorporate all 227
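As a rough illustration of this kind of analysis, the sketch below tallies how often each code appears across a set of coded responses. The code labels and responses are invented for illustration; they are not the paper's actual categories or data.

```python
# A hypothetical sketch of tallying coded survey responses under a
# coding framework like the one described above. The code labels and
# responses are invented; they are not the paper's actual data.

from collections import Counter

coded_responses = [
    ["content:worked_example", "benefit:exam_preparation"],
    ["content:formula_summary", "layout:single_page"],
    ["content:worked_example", "benefit:reduced_memorization"],
]

# Flatten the per-response code lists and count occurrences of each code.
tally = Counter(code for response in coded_responses for code in response)
for code, count in tally.most_common():
    print(f"{code}: {count}")
```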
the seven principles of good feedback practice [7]. The quizzes 1) helped clarify what a good performance was, 2) facilitated the development of self-assessment (reflection) in learning, 3) delivered high-quality information to students about their learning, 4) encouraged teacher and peer dialogue around learning, and 5) provided information that we could use to modify our teaching. The studio format and flipped nature of the course were key to supporting these basic feedback principles.

Experiment Results

The most significant effect of the latest method of flexible assessment was seen in its impact on the final overall course grade and one of the final exams. Table 2 shows the lab and lecture final exam averages from the previous (Spring & Fall 2018
[Figure 3: Association with minority groups of the 23 study participants]

With the demographic context provided by Figures 2 and 3 in mind, the main result of our study so far is the master codebook itself, as shown in Table 2. The codebook follows the hierarchical structure depicted in Figure 1 and is divided into six topics: engineering discipline, engineering experience, engineering connection, support for success (during college), obstacles and deterrents (during college), and reflection on engineering identity. Within this