compared to ascertain the relative gains (if any) that are directly attributable to the MILL model intervention, which is the objective of this work.

Acknowledgement

The work described in this paper was supported by the National Science Foundation IUSE Program under grant number DUE-1432284. Any opinions, recommendations, and/or findings are those of the authors and do not necessarily reflect the views of the sponsors.

References

1. SME Education Foundation website: http://71.6.142.67/revize/sme/about_us/history.php
2. Ssemakula, M. E. and Liao, G.: ‘Implementing the Learning Factory Model in a Laboratory Setting’, IMECE 2004, Intl. Mechanical Engineering Congress & Exposition, Nov. 13-19, 2004; Anaheim, CA.
3. Ssemakula, M.E. and Liao, G
charts (Plots A and C in both figures) reflect Anatomy course scores, and the bottom bar charts (Plots B and D in both figures) reflect Statics course scores. Data are presented first for the MCT instrument, applied in a pre- and post-test format in both classes, and then for the PSVT:R in the same fashion. Kurtosis and skewness will be discussed as relevant descriptive statistics for each bar chart, and comparisons can then easily be made between the Anatomy and Statics pre- and post-performance on both instruments. A typical bell curve centered on the mean has been provided to aid visual confirmation of data normality.

MCT Results

Based on the pre-MCT results, the Anatomy course (Fig. 1, Plot A) had kurtosis
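The skewness and kurtosis statistics discussed above can be computed directly from sample moments. The sketch below is a minimal illustration, not the study’s analysis code, and the score data are synthetic; it uses the population (biased) moment formulas, under which values near zero for both statistics are consistent with approximate normality.

```python
import numpy as np

def skewness(x):
    """Third standardized moment (biased/population form)."""
    x = np.asarray(x, dtype=float)
    m, s = x.mean(), x.std()
    return ((x - m) ** 3).mean() / s ** 3

def excess_kurtosis(x):
    """Fourth standardized moment minus 3, so a normal distribution scores near 0."""
    x = np.asarray(x, dtype=float)
    m, s = x.mean(), x.std()
    return ((x - m) ** 4).mean() / s ** 4 - 3.0

# Synthetic "test scores" drawn from a normal distribution for illustration only.
scores = np.random.default_rng(1).normal(75, 10, size=5000)
print(round(skewness(scores), 3), round(excess_kurtosis(scores), 3))
```

For a roughly normal score distribution both values should sit close to zero; strongly negative excess kurtosis would indicate a flatter-than-normal distribution, and nonzero skewness an asymmetric one.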
-college engineering programs to first-year engineering (Ph.D.). Purdue University, United States -- Indiana.
15. Turner, D. W. (2010). Qualitative interview design: A practical guide for novice investigators. The Qualitative Report, 15(3).
16. Kvale, S. (1996). Interviews: An introduction to qualitative research interviewing (1st ed.). Thousand Oaks, CA: SAGE Publications.
17. Walther, J., Sochacka, N. W., & Kellam, N. N. (2013). Quality in interpretive engineering education research: Reflections on an example study. Journal of Engineering Education, 102, 626–659. doi:10.1002/jee.20029
compare different feedback structures, both visually (as a network and projected point) and through summary statistics that reflect the weighted structure of connections. The remainder of this section outlines the method of ENA. The details of how ENA was used to analyze the coaching sessions are provided in the Results and Discussion section.

To begin our ENA of co-occurrences of discourse elements (the codes in Table 1), we first subdivided the utterances of discourse into groups of utterances, called stanza windows. The utterances within a window are assumed to be topically related. In this study, we examined conversations between students and coaches in which students and coaches respond to each other’s previous discourse. As a result
century instruction is process-oriented, evaluation of instruction can thus reflect a process-oriented schema to more clearly represent what is under evaluation.30 The field of engineering education needs more contextually relevant, evidence-based research about evaluation methodology for GBL. Adding the results of this study to the literature base can help bridge educational research methodology and the actual practice of GBL in engineering education. The authenticity of our education research methodology has wide applicability for engineering education researchers desiring to assess the effects of GBL unobtrusively on students’ learning while doing.

2. Problem statement

Critical components missing from the education research literature are methods to reliably and credibly
, the stakeholders did not add information to fill any gaps in the information they provided during the interviews.

Limitations

Our primary limitation was the small number of people interviewed at each site; therefore, we may not have saturated the data set. However, our participants did include the key personnel, by title, at each location (e.g., director of career services and engineering liaison). In addition, a review of our findings with stakeholders at each site demonstrated that the themes developed accurately reflect our two case sites. Finally, our participants were subject matter experts regarding student career services for their respective universities.

Results

We organized our results by case site and then compared the sites. The results for each
order to optimize the classification effort while still capturing the nature and level of feedback activity. For example, we recognize the importance of needs analysis and the emphasis that experts place on this stage versus novices, so the important coding classifications of problem identification, representation, and communication are prominent in our model. Additionally, the verification classification is available at each stage, as this reflects best design practice.

Figure 1. A generalized engineering design process model with coding classifications
. Holly Matusovich for contributing to this study. Also, this material is based upon work supported by the National Science Foundation (NSF) as a Graduate Research Fellowship. Any opinions, findings, and conclusions in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References:

1. M. Gläser-Zikuda and S. Järvelä, Application of qualitative and quantitative methods to enrich understanding of emotional and motivational aspects of learning, International Journal of Educational Research, 47(2), 2008, pp. 79-83.
2. K. E. Winters, H. M. Matusovich, M. S. Brunhaver, H. L. Chen, K. Yasuhara and S. Sheppard, From Freshman Engineering Students to Practicing Professionals: Changes in Beliefs about Important
perceive themselves to fit into a given group, in this case engineering,5 which in turn affects how they progress along the academic and career path in their field.6

The engineering identity framework utilized in this study is partially based on a physics identity model composed of four basic factors: performance, competence, interest, and recognition.5,7 Performance describes a student’s belief in their ability to perform in their classes or when conducting engineering tasks.8 If a student performs poorly in class, they are less likely to identify themselves as an engineer. Competence describes a student’s belief in their ability to understand engineering material, which is often similarly reflected in a student’s performance in class.8 Interest describes how
. [4] Organization for Economic Co-operation and Development. (2005). Definition and Selection of Competencies (DeSeCo) Project. Retrieved from http://www.oecd.org/education/skills-beyond-school/41529556.pdf
[5] Williams, J. (2002). The engineering portfolio: Communication, reflection, and student learning outcomes assessment. International Journal of Engineering Education, 18(2), 199–207.
[6] Boiarsky, C. (2004). Teaching engineering students to communicate effectively: A metacognitive approach. International Journal of Engineering Education, 20(2), 251–260.
[7] Gömleksiz, M. N. (2007). Effectiveness of cooperative learning (jigsaw II) method in teaching English as a foreign language to
inverted sections with those in control sections (i.e., the traditional course model). Treatment and control students completed the same measures (e.g., content assessments and student attitude surveys), and faculty members who taught in both conditions also completed reflection papers related to their experiences. The guiding research questions for the study and an overview of the assessment measures are shown in Table 1 below (more details on the assessment measures are included in a subsequent section of this paper). In the final year of the study, the researchers implemented what they felt were “best practices” for the inverted model in all sections of their courses, and the same outcome measures were used.

Table 1. Evaluation Questions and Outcome Measures
. However, there were participants across a variety of ethnicities and from all student classifications, including graduate students. Other majors represented in the sample were Mechanical Engineering, Construction Science, Petroleum Engineering, and various other engineering programs. Data on handedness were also gathered: 12.9% (n = 22) of the participants were left-handed, which is reflective of the population as a whole. A summary of the demographics of the participants is found in Table 3.

Table 3: Demographic information (Total Number of Participants: N = 170; tabulated by Student Gender, College Major, and Ethnicity)
Feedback: A Learning Theory Perspective, Educational Research Review 9, 1-15.
5. Quinton, S., and Smallbone, T. (2010) Feeding Forward: Using Feedback to Promote Student Reflection and Learning – A Teaching Model, Innovations in Education and Teaching International 47, 125-135.
6. Narciss, S. (2008) Feedback Strategies for Interactive Learning Tasks, Handbook of Research on Educational Communications and Technology 3, 125-144.
7. Creasy, M. A. (2015) Data Extraction from Web-Based Learning Management Systems, In Illinois-Indiana ASEE Conference, Fort Wayne, Indiana.
8. Creasy, M. A. (2014) Hybrid Class Experiences: Flipping Mechanics Courses and Homework Feedback, In ASEE Illinois/Indiana Section
. Describe future research directions
   7A. Outline ‘next steps’ or future work
   7B. Suggest methodological improvements
8. Engage in learning
   8A. Appropriately connect/use course concepts in the investigation process
   8B. Identify/reflect on “lessons learned”
   8C. Manage time and resources effectively to complete the investigation

In problem analysis, the student displays the ability to:
1. Define the problem
   1A. State the problem in their own words
   1B. Identify primary problem goal(s)
   1C. Characterize the type of problem and the type of solution sought
   1D. Represent the problem visually (e.g., free body diagram, circuit schematic)
   1E. Identify known information
   1F. Recognize
Engineering Education.
21. Huff, J. L., Smith, J. A., Jesiek, B. K., Zoltowski, C. B., Graziano, W. G., & Oakes, W. C. (2014). From methods to methodology: Reflection on keeping the philosophical commitments of interpretative phenomenological analysis. 2014 IEEE Frontiers in Education Conference (FIE) Proceedings.
22. Smith, J. A., Flowers, P., & Larkin, M. (2009). Interpretative Phenomenological Analysis: Theory, Research, Practice. London: Sage.
23. Godwin, A., Potvin, G., Hazari, Z., & Lock, R. (2013). Understanding engineering identity through structural equation modeling. 2013 IEEE Frontiers in Education Conference (FIE).
24. Hazari, Z., Sonnert, G., Sadler, P. M., & Shanahan, M. (2010). Connecting high school physics
provided strong evidence of validity for the EPRA tool from some of the interview cases examined.

Acknowledgements

This material is based on work supported by the National Science Foundation under Grant #1158863. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Bibliography

1 ABET, "Criteria for Accrediting Engineering Programs Effective for Evaluation During the 2015-2016 Accreditation Cycle," ABET Engineering Accreditation Commission, Baltimore, MD, 2014.
2 L. J. Shuman, M. Besterfield-Sacre and J. McGourty, "The ABET "Professional Skills" - Can They Be Taught? Can They Be Assessed?," Journal of
) between students’ use of representations in each stage and the respective scores. One exception was the configuration step, where the score was well correlated with the amount of representations used; the correlation was .52 (p-value = .002). To evaluate the effect of each type of representation on this score, we performed a multiple linear regression. Equation (1) describes the model used to predict students’ score on the configuration step (SC) from the numbers of images, plots, tables, equations, calculations, and charts. Results reveal a significant effect of the use of equations on this stage (p-value < 0.016); no other type of representation had a significant effect. This fact could reflect the
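A model of the form described for Equation (1) can be fit with ordinary least squares. The sketch below is a minimal illustration under stated assumptions, not the study’s analysis: the design matrix stacks an intercept with the six representation counts, and all data are synthetic, with only the ‘equations’ coefficient made nonzero to mirror the reported finding.

```python
import numpy as np

# OLS sketch of an Equation (1)-style model (synthetic data, illustration only):
#   SC = b0 + b1*images + b2*plots + b3*tables + b4*equations + b5*calculations + b6*charts
rng = np.random.default_rng(0)
n = 40
counts = rng.integers(0, 5, size=(n, 6))                    # per-student counts of the six representation types
true_beta = np.array([1.0, 0.1, 0.0, 0.0, 0.8, 0.0, 0.0])   # only the 'equations' slope (index 4) is substantial
X = np.column_stack([np.ones(n), counts])                   # design matrix with intercept column
sc = X @ true_beta + rng.normal(0.0, 0.2, size=n)           # simulated configuration-step scores with noise
beta_hat, *_ = np.linalg.lstsq(X, sc, rcond=None)           # ordinary least-squares coefficient estimates
print(np.round(beta_hat, 2))
```

With adequate sample size, the estimated coefficient on the ‘equations’ count dominates the others, which is the pattern a significant equations effect (and null effects elsewhere) would produce.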
to which respondents indicate their level of agreement on a four-point Likert scale, from strongly agree to strongly disagree. Participants respond to the 26 items for each of the three classroom strategies (formative feedback, real-world applications, and initiating student-to-student discussions), thus yielding 78 data points.

VECTERS additionally contains questions to collect demographic information about the instructors, as well as general information about the engineering course they are reflecting upon when responding to VECTERS. Instructor information includes gender, ethnicity, and years of experience. Course information includes items to indicate the course level (100 to 400), whether the course is required, and the
reliability of each survey tool. To establish content validity, the NSSE relies on a panel of experts and uses student self-report data.31 In terms of reliability, NSSE has a reported value of 0.70 or higher for deep learning, which includes higher-order, integrative, and reflective learning items.32 Reliability values close to or above 0.70 are generally considered acceptable in statistical analysis.33 In terms of response process validity, NSSE used cognitive interviews and focus groups to determine that the survey was valid for students of different races/ethnicities.34 ECAR has not published information on the validity or reliability of its questionnaires.

Since the present study relied on a newly constructed assessment tool, a panel of experts was used to
are more frequently placed in the role of a passive spectator, it can often be difficult to get students to participate in class.1-3

Despite some of its drawbacks and difficulties, discussion can also be used as a tool for active learning when applied in an online discussion forum. During discussion, participants have the opportunity to interact and collaborate with one another to meet their learning needs.8 Furthermore, moving discussion to an online venue has several advantages. First, instructors and students have the convenience of being able to add to a discussion asynchronously: they have time to reflect on discussion prompts and to formulate a well-thought-out response. Second, online discussions can increase the amount of
psychographic measures developed in this study reveal nuances in student values of sustainability and global citizenship, highlighting the importance of continually revising educators’ understandings of student understanding in order to graduate informed and dedicated students who will engage in, design for, and implement sustainability in their future careers.

Acknowledgements

The authors gratefully acknowledge the National Science Foundation for its support of this work under the TUES program (grant number DUE-1245464). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Bibliography

1. Beane, T. P., &
. Written communications - Delivering effective written communications, including creating engineering documents such as reports, case studies, memos, and minutes of meetings. How to write, manage, and respond to emails is also a focus of this module, as well as the use of social media.
4. Listening - Active listening techniques such as paraphrasing, clarifying, and reflecting.
5. Visual communications - How to create an effective visual image via a diagram, drawing, or poster.
6. Nonverbal communications
approach is that, by being alerted by students to problematic language or missing content before the rubric’s application, the researchers can meet to address the issue by making appropriate modifications to the rubric in question.

Engineering Education Majors

The engineering education majors were asked to write reflective essays at the conclusion of the semester. In terms of the positive effects that the rubrics had as a formative assessment tool, one engineering education major mentioned the following:

“We have made many changes that have improved this project for us and for all the students involved. The communication level has been much better this year and we have been able to help improve the quality of the
participate reflected the demographics of the Faculty, a purely serendipitous occurrence. Of the 22 participants, five students were not visible minorities in engineering, nine appeared to be English dominant, and seven were female. None of the teams investigated in this paper consisted of all monolingual English speakers, and only one team, Team 4, consisted of all domestic students. The language diversity of the teams was representative of the University’s (and in particular the Faculty’s) linguistic diversity. Given the demographics of the teams and the student population in this course, the probability of having teams volunteer that did not have similar diversity to the student body was minimal. The students’ motivations for