Collection
2020 Fall ASEE Mid-Atlantic Section Meeting
Authors
Patricia Muisener, Stevens Institute of Technology (School of Engineering and Science); Gail P Baxter, Stevens Institute of Technology; Guillermo D. Ibarrola Recalde, Stevens Institute of Technology (School of Engineering and Science)
the content using a number of different strategies, including peer-to-peer instruction, active learning, online resources, and weekly quizzes to facilitate self-assessment and reflection. In this paper, we describe initial efforts to incorporate one type of metacognitive strategy (i.e., prompting students to think about and reflect on their learning and understanding of the content taught each week) in the General Chemistry course. Key questions of interest include: What is the nature of student responses (conceptual or procedural)? Do responses vary by course week and/or gender? What is the relationship between student responses (conceptual or procedural) and their performance on the exam? Research has demonstrated that active and collaborative
Collection
2020 Fall ASEE Mid-Atlantic Section Meeting
Authors
Vazgen Shekoyan; Sunil Dehipawala, City University of New York, Queensborough Community College; Dimitrios S. Kokkinos, City University of New York, Queensborough Community College; Rex Taibu; George Tremberger Jr; Tak Cheung
transference to “remotely doing a lab” would not be easy to assess during lockdown, when face-to-face practical final exams are impracticable to schedule. Assessment would certainly include grading, but grading alone would not provide an adequate holistic assessment. The construction of an assessment rubric for online experiential learning, based on the McGill University face-to-face experiential learning assessment principle concerning content-process mixture, big-picture perspective, and reflection, is presented here. The advances in artificial intelligence software as they pertain to online experiential learning are discussed.

Keywords: Asynchronous online delivery, experiential learning, tacit and explicit knowledge

Introduction

The online delivery of
Collection
2020 Fall ASEE Mid-Atlantic Section Meeting
Authors
Ping-Chuan Wang, State University of New York at New Paltz
Figure 4. Students’ perspective on the discipline(s) that is/are more relevant to the microelectronics field, surveyed at the end of the semester.

Student Evaluation of Instruction (SEI) was conducted near the end of the semester to assess teaching effectiveness and the learning experience. With 88% participation from the class, the SEI results can be used as another reference for the course’s effectiveness. Regarding “teaching method engaged my interest in the subject matter,” the course received a score of 4.93 on a scale of 1 (strongly disagree) to 5 (strongly agree), with 14 students strongly agreeing and one agreeing. With respect to “instruction was effective,” it scored 5.00. When asked to reflect on their experience in the course, all the
Collection
2020 Fall ASEE Mid-Atlantic Section Meeting
Authors
Sunil Dehipawala, City University of New York, Queensborough Community College; Vazgen Shekoyan; Dimitrios S. Kokkinos, City University of New York, Queensborough Community College; Rex Taibu; George Tremberger Jr; Tak Cheung
assessment in terms of social learning could be included during the COVID-19 lockdown. A reflection can usually start with “Why”. The 3 Whys (“Why am I learning this?”, “Why do people care that I learn this?”, and “Why should I care?”) are good starting points for improving self-awareness. “Think About Your Own Thinking” is another, deeper self-assessment metacognition strategy [13]. The auditory probabilistic thinking process in inductive reasoning while listening to MP3 media should be reflected in the students’ answers [14]. Both formative and summative assessments are useful [15]. A weekly formative assessment may be able to grow a self-assessment mindset by the tenth week, and a summative assessment in the fifteenth week would then yield a holistic self-assessment
Collection
2020 Fall ASEE Mid-Atlantic Section Meeting
Authors
Mojeed Olamide Bello, Morgan State University; Nkiruka Jane Nwachukwu, Morgan State University; Ida Mougang Yanou N, Morgan State University; Niangoran Koissi, Morgan State University; Celeste Chavis P.E., Morgan State University; Oludare Adegbola Owolabi P.E., Morgan State University; Jumoke 'Kemi' Ladeji-Osias, Morgan State University
Tagged Topics
Diversity
], [2]. Conversely, the effectiveness of hands-on learning can be reduced if there are inadequate levels of student engagement and reflection [3], [4]. There are different learning settings in which a student can engage, such as a laboratory, online classes, and daily activities [5], [6], [7]. This study shows how traditional labs can be transformed into hands-on labs by integrating USB-based personal instrumentation used in electrical engineering. This approach is based on experimental centric pedagogy, which integrates problem-based activities and constructivist-based instruction using personal instrumentation designed to replace larger laboratory equipment [8]. For this project, the electrical engineering team supported each experiment
Collection
2020 Fall ASEE Mid-Atlantic Section Meeting
Authors
Sunil Dehipawala, City University of New York, Queensborough Community College; Dimitrios S. Kokkinos, City University of New York, Queensborough Community College; Vazgen Shekoyan; Rex Taibu; George Tremberger Jr; Tak Cheung
learning assessment. The importance of understanding scientific principles in engineering education, recognized in the 2000 National Academy of Engineering Founders Award given to Townes and the 2019 National Academy of Engineering Gordon Prize for Innovation in Engineering and Technology Education given to Benkeser, should also be included in an assessment rubric [28], [29]. The assessment consists of the three deliverables developed by the McGill University experiential learning team in terms of content-process mixture, big-picture perspective, and reflection [30], and an additional deliverable on scientific data resolution related to engineering and technology. An assessment rubric example for experiential learning is listed in Table 1.

Table 1: Assessment
Collection
2020 Fall ASEE Mid-Atlantic Section Meeting
Authors
Johannes Weickenmeier, Stevens Institute of Technology (School of Engineering and Science)
Tagged Topics
Diversity
a way to achieve the desired understanding and learning outcomes from our participants’ academic curriculum, which aims at training and engaging students in creative and analytical activities [2]. Participating in, contributing to, and reflecting on research is particularly powerful in building on the basic knowledge acquired during course work. This holds true in general but requires particular consideration and rethinking in terms of distance learning environments. The primary goal of our undergraduate student research program was to engage students in active research and to provide a mentored experience for independent research work. Due to a comprehensive COVID-related campus closure starting in March 2020, which included a majority of the research labs, we
Collection
2020 Fall ASEE Mid-Atlantic Section Meeting
Authors
Jeremy David Paquin, United States Military Academy; Matthew Louis Miller, United States Military Academy; Jes Barron, U.S. Military Academy
for tackling such problems, we need to be testing them with solving realistic problems under realistic conditions” [16]. In assessing the student-provided note sheet, Wieman hypothesizes that “it is beneficial to have students do activities that have them reflect on the course material and how it is organized, such as what they would do in preparing a cheat sheet, but that should not be made an either/or choice coupled with the kind of exams that we give” [16]. Wieman further hypothesizes that open-everything exams are better predictors of future success, stating that the more “exams resemble solving authentic problems in realistic environments, the more meaningful measures they will be of how our students will be able to perform in
Collection
2020 Fall ASEE Mid-Atlantic Section Meeting
Authors
Jan Cannizzo, Stevens Institute of Technology
doing this, instructors may wish to provide students with brief feedback on their work. Grading pre-class exercises, which can wait until after class, is again very straightforward: students’ work should be graded for completion and seriousness of effort. A +/✓/− grading scale should again be sufficient for this purpose. Since the pre-class exercises are thoroughly discussed in class, there is no need to provide feedback when grading; it thus takes seconds to grade each assignment. Post-class homework, which is intended to reflect the final level of achievement of each student, should be graded carefully for completion, correctness, and clarity of reasoning. Providing detailed feedback is highly recommended.

Conducting class

To prepare for class, the
Collection
2020 Fall ASEE Mid-Atlantic Section Meeting
Authors
Dov B Kruger, Stevens Institute of Technology (School of Engineering and Science); Gail P Baxter, Stevens Institute of Technology
continue to collect data in a number of classes as assessments are written, and attempt to quantify the effect of scaffolded assessments on learning to program in a number of courses. One problem in administering these kinds of questions is that existing assessment software in learning management systems tends to destroy the formatting of questions to the point where students find the code incomprehensible. To be effective, assessments must be reliable, accurately reflecting the knowledge and skills they claim to test, and the software must allow fast and efficient generation of as many assessments as needed to give students practice solving new problems, not merely memorizing answers from previous ones. Current tools such as Canvas actively obstruct