, University of Missouri, Columbia
Rose M. Marra is a Professor of Learning Technologies at the University of Missouri. She is PI of the NSF-funded Supporting Collaboration in Engineering Education and has studied and published on engineering education, women and minorities in STEM, online learning, and assessment. Marra holds a Ph.D. in Educational Leadership and Innovation and worked as a software engineer before entering academe.
Mr. Nai-En Tang
Dr. David H. Jonassen, University of Missouri, Columbia
Dr. David Jonassen passed away in December 2012. He was a Curators' Professor at the University of Missouri, where he taught in the areas of Learning Technologies and Educational Psychology. Dr. Jonassen was the PI of the NSF
Paper ID #10435
Use of Online Assessment and Collaboration Tools for Sustainable Building Practices Course
Dr. Rui Liu, The University of Texas at San Antonio
Dr. Yilmaz Hatipkarasulu, University of Texas at San Antonio
© American Society for Engineering Education, 2014

Use of Online Assessment and Collaboration Tools for Sustainable Building Practices Course

Abstract
In the last decade, sustainable building and green construction practices have become an important part of the construction industry. The
multidisciplinary teams as specified in the ABET Engineering Accreditation Commission Student Outcome (d), "an ability to function on multidisciplinary teams." This paper presents an experience of using a team-based case study project as an active learning tool in a required EE and CS course for assessing the attainment of this student outcome. The performance indicators clearly demonstrate that the ABET Engineering Accreditation Commission Student Outcome (d) is successfully attained.

I. Introduction
Since the ABET Engineering Criteria 2000 accreditation, efforts to satisfy Criterion 3(d), an ability to function on multidisciplinary teams, have resulted in a large literature on the topics of team-based learning,1 collaborative learning,2 learning organization,3
have been set. They conclude that future research should collect data concerning the extent to which feedback recipients set goals after receiving multisource feedback. The research design and empirical model I employ in the current study are based largely on the preceding theoretical model and research recommendations. For instance, I assess participants' reactions to receiving conscientiousness feedback as well as the extent to which they are subsequently motivated to set and pursue self-development goals. Using structural equation modeling (SEM) techniques, I examine direct and indirect effects among these and other key variables in the multisource feedback process.

Summary
The unique human capacity for self-understanding, while
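To make the analytic approach concrete, the sketch below shows one way direct and indirect effects in a path model of this kind can be estimated. It is a minimal illustration under stated assumptions: the variable names (feedback_reaction, goal_motivation, goal_pursuit) are hypothetical stand-ins for the study's constructs, and semopy is an assumed choice of SEM library, not necessarily the software used in the study.

```python
# Minimal path-analysis sketch for separating direct and indirect effects.
# Variable names and model structure are hypothetical stand-ins for the
# study's constructs; semopy is an assumed (not confirmed) choice of library.
import pandas as pd
from semopy import Model

MODEL_DESC = """
goal_motivation ~ feedback_reaction
goal_pursuit ~ goal_motivation + feedback_reaction
"""

def fit_paths(data: pd.DataFrame) -> pd.DataFrame:
    """Fit the path model and return the estimated coefficients."""
    model = Model(MODEL_DESC)
    model.fit(data)          # columns of `data` must match the variable names
    return model.inspect()   # parameter table: path estimates and variances

# The indirect effect of feedback_reaction on goal_pursuit is the product of
# the feedback_reaction -> goal_motivation and goal_motivation -> goal_pursuit
# path estimates; the direct effect is the remaining feedback_reaction path.
```

Estimating both paths in a single model, rather than in separate regressions, is what allows the direct and indirect components of the total effect to be reported together.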
given. Rasila et al.7 detailed some of the benefits of an online assessment tool for engineering mathematics, including improved feedback to students. Similarly, Chen et al.8 showed how lecture content could be guided by electronic conceptual quizzes that were assigned during lecture. This form of student engagement produced a significant increase in student performance and enabled the professor to rapidly assess students' misconceptions.

Van Arsdale and Stahovich9 examined features which characterized the temporal and spatial organization of students' solutions to exam problems in a mechanical engineering statics course. These features were then used to predict students' performance on individual exam problems. In contrast, we examine features that
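As a concrete illustration of this feature-to-performance setup (not the cited study's actual features or model), the following sketch fits a simple regression from hypothetical solution-process features to per-problem scores; scikit-learn, the feature names, and the data values are assumptions.

```python
# Sketch of predicting per-problem exam performance from features describing
# how the solution was produced. Feature names, values, and the choice of a
# linear model are illustrative assumptions, not the cited study's method.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical per-problem features: minutes spent, number of spatially
# distinct work regions on the page, fraction of writing in the final minutes.
features = np.array([
    [12.0, 3, 0.10],
    [25.0, 6, 0.45],
    [ 8.0, 2, 0.05],
    [18.0, 4, 0.30],
    [22.0, 5, 0.50],
])
scores = np.array([0.90, 0.55, 0.95, 0.75, 0.60])  # fraction of credit earned

model = LinearRegression().fit(features, scores)
print("in-sample R^2:", round(model.score(features, scores), 2))
print("predicted score:", model.predict([[15.0, 3, 0.20]]))
```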
Paper ID #6505
Process Analysis as a Feedback Tool for Development of Engineering Problem Solving Skills
Dr. Sarah Jane Grigg, Clemson University
Mrs. Jennifer Van Dyken, Clemson University
Dr. Lisa Benson, Clemson University
Lisa Benson is an Associate Professor in the Department of Engineering and Science Education at Clemson University, with a joint appointment in the Department of Bioengineering. Dr. Benson teaches first year engineering, undergraduate research methods, and graduate engineering education courses. Her research interests include student-centered active learning, assessment of motivation, and how
Paper ID #7325
Quantitative Assessment of Program Outcomes Using Longitudinal Data from the FE Exam
Dr. Joe C. Guarino, Boise State University
Professor of Mechanical and Biomedical Engineering
Prof. James R. Ferguson, P.E., Boise State University
Associate Professor
Dr. V Krishna C Pakala, Boise State University
© American Society for Engineering Education, 2013

Quantitative Assessment of Program Outcomes Using Longitudinal Data from the FE Exam

There have been many studies
D80 Center, which offers contribution-based learning, research, and service opportunities for students with the poorest 80% of humanity. Dr. Paterson is a noted educator, workshop facilitator, and public speaker on community engagement, and leads several initiatives for learning engineering through service, recently leading ASEE's newest division, Community Engagement in Engineering Education. He is PI on several research projects assessing the impacts of community engagement on students, faculty, and communities around the world.
Dr. Chris Swan, Tufts University
Dr. Olga Pierrakos, James Madison University
Dr. Olga Pierrakos is an associate professor and founding faculty member of the James Madison University
of mathematics and engineering science, accompanied by laboratory and workshop experiences. The formative years should be devoted to individual learning, followed by team activities and peer group interactions, and then immersion in creativity and innovation in the workplace, e.g., research participation.

Some global trends are evident in engineering education over the past two decades:
1. Global adoption6,7,8,9 of the ABET2000 model of self-assessment processes as the basis for accreditation of undergraduate programs, where showing "improvement" replaces standards.
2. Uncritical adoption of the US K-12 model of teaching
teach professional skills, helping students develop these skills is more difficult than it may seem. Many educators view professional skills as important aspects of practice. However, there is sometimes resistance from engineering students and educators to emphasizing these skills in the curriculum. There are many reasons engineering faculty still struggle with teaching these skills. Cajander et al. suggest "that many educators have an intuitive grasp of what professional skills are, but struggle to give a clear definition of them and to define rubrics for their assessment" (p. 1).20 Other reported reasons from computer science include limited room in the curriculum, lack of experience or familiarity with professional skills, and a view that professional skills are
regression analysis results for both the overall-effort and per-problem models provide correlations that are much stronger than those found in prior work. As mentioned earlier, those studies typically relied on either the students or their parents to report the amount of time spent working on each homework assignment. The Livescribe™ digital pens provide a more reliable measure of the amount of time students spend on their homework assignments, which may account for the higher coefficients of determination we obtain.

Conclusion
In this paper, we have presented novel, data-driven methods for assessing students' homework habits in a Mechanical
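For readers less familiar with the statistic, the short example below works through the coefficient of determination for a single-predictor, overall-effort model; the data values are hypothetical, and the paper's models are richer than this.

```python
# Worked example of the coefficient of determination (R^2) for a simple
# overall-effort model: total recorded homework time vs. course score.
# Values are hypothetical, not data from the study.
import numpy as np

minutes = np.array([310.0, 145.0, 420.0, 260.0, 180.0, 390.0])
scores  = np.array([ 88.0,  62.0,  93.0,  81.0,  70.0,  90.0])

slope, intercept = np.polyfit(minutes, scores, deg=1)   # least-squares fit
predicted = slope * minutes + intercept

ss_res = np.sum((scores - predicted) ** 2)        # residual sum of squares
ss_tot = np.sum((scores - scores.mean()) ** 2)    # total sum of squares
print(f"R^2 = {1.0 - ss_res / ss_tot:.2f}")
```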
of the course.

Assessment of the course
To assess the success of the course in delivering the course outcomes to students, it is necessary to rely on direct student feedback from end-of-term course evaluations. For the purposes of this paper, ratings and comments will be presented that deal specifically with how well the course promoted technological and socio-cultural understanding. The rating scale for each question is from 0 to 4, with '0' meaning "Strongly Disagree" and '4' meaning "Strongly Agree", as well as an option for "Not Applicable (N/A)". Students submitted these evaluations during the final week of the course, and all responses are anonymous.

Before addressing the evaluation scores, it should be noted that between Fall 2010 and Fall
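A small sketch of how such ratings could be summarized is given below; the question labels and responses are hypothetical, and the only assumption carried over from the text is the 0 to 4 scale with "N/A" treated as missing rather than zero.

```python
# Sketch of summarizing end-of-term evaluation ratings on a 0-4 scale,
# excluding "N/A" responses from the mean. Questions and responses are
# hypothetical examples, not the course's actual evaluation data.
import statistics

responses = {
    "Course promoted technological understanding": [4, 3, 4, 2, "N/A", 3],
    "Course promoted socio-cultural understanding": [3, 4, "N/A", 4, 3, 3],
}

for question, ratings in responses.items():
    numeric = [r for r in ratings if r != "N/A"]
    mean = statistics.mean(numeric)
    print(f"{question}: mean = {mean:.2f} (n = {len(numeric)})")
```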
these characteristics, engineering literature was examined for ways to measure creativity and novelty in engineering. Charyton et al.2 examined the Creative Engineering Design Assessment (CEDA) and demonstrated the reliability of the test. The article points out that the test is evaluated on three criteria, informing the reader what types of linguistic qualities to look for in the analysis. The criteria are originality, based on the introduction of new ideas; fluency, based on the number of new ideas; and flexibility, based on the number of different types of ideas.2 The types of linguistic categories that could possibly indicate creativity, based on information from this article, include uniqueness and frequency of unique versus common words
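To illustrate what a "uniqueness and frequency of unique versus common words" measure might look like in practice, the sketch below computes two simple lexical statistics; the tokenization, the rarity threshold, and the example text are assumptions rather than the coding scheme used in the analysis.

```python
# Sketch of quantifying unique vs. common word use in a design description.
# Tokenization, the <=1 rarity threshold, and the example corpus are
# illustrative assumptions, not the study's actual coding scheme.
import re
from collections import Counter

def uniqueness_profile(text: str, corpus_counts: Counter) -> dict:
    """Compare one response's vocabulary against word counts from all responses."""
    words = re.findall(r"[a-z']+", text.lower())
    rare = [w for w in words if corpus_counts[w] <= 1]
    return {
        "type_token_ratio": len(set(words)) / max(len(words), 1),
        "rare_word_fraction": len(rare) / max(len(words), 1),
    }

# The corpus counts would normally be built from every participant's response.
corpus = Counter("a cap that doubles as a solar charger a cap with a wide brim".split())
print(uniqueness_profile("a cap that doubles as a solar charger", corpus))
```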
writing, others have observed[3],[4] that college faculty fail to agree on how to define good writing and thus on how to promote and assess it. Even within engineering education, faculty do not all share the same view of writing literacy.[5] In the interest of facilitating a more meaningful synthesis, we focus our literature review primarily on the archival literature of the field of engineering education research. Here we find only a modest number of empirical studies seeking to describe or assess the state of engineering student writing abilities. In 2008, Paretti[6] observed that most engineering education scholarship, at least in the area of technical communication in design courses, "has focused on describing course assignments and strategies for
Paper ID #41917
Giving Voice to Problem-Solving: Hearing Students' Techniques in Video Reflections
Dr. Tammy VanDeGrift, University of Portland
Dr. Tammy VanDeGrift is a Professor and Chair of Computer Science at the University of Portland. Her research interests include computer science education, pedagogy, and best practices for retention and engagement.
© American Society for Engineering Education, 2024

Giving Voice to Problem-Solving: Hearing Students' Techniques in Video Reflections

Abstract
Written exams are regularly used to assess students' skills in problem-solving
Paper ID #44303
Reflections on 10 Years of Operating a Computer-based Testing Facility: Lessons Learned, Best Practices
Dr. Jim Sosnowski, University of Illinois Urbana-Champaign
Jim Sosnowski is the Assistant Director of the Computer-Based Testing Facility (CBTF) at the University of Illinois Urbana-Champaign.
Dr. Julie M Baker, University of Illinois Urbana-Champaign
Julie Baker is a Learning Design Specialist for the Applied Technologies for Learning in the Arts and Sciences (ATLAS) group in the College of Liberal Arts and Sciences (LAS). She helps LAS faculty implement best practices for computer-based assessment and
Paper ID #41930
Improving Efficiency and Consistency of Student Learning Assessments: A New Framework Using LaTeX
Dr. Ira Harkness, University of Florida
Ira Harkness is an Instructional Assistant Professor in the Department of Materials Science and Engineering. He has two decades of experience in higher education, including directing information technology and facilities efforts at UF, and working with non-profits and community organizations to address K-12 education. His expertise is in computational nuclear engineering and nuclear engineering education.
Prof. Justin Watson
© American Society for
places significance on the long-term influence of students' education as future professionals [1, p. 4], [2, p. 3]. ABET requires engineering programs to articulate their individual "program educational objectives," which essentially serve as "comprehensive statements outlining the achievements graduates are anticipated to reach within a few years post-graduation" [1, p. 4], [2, p. 3]. Numerous U.S. programs incorporate teamwork effectiveness and clear communication leading to successful team outcomes in their "program educational objectives." While ABET requires a thorough evaluation and assessment process for attaining student outcomes (short-term), it does not impose a similar process for assessing the long-term impact on program graduates as they transition to
Research Fellow. He has developed Five Simplified Integrated Methods of Solution (SIMS) for his book on "Essential Engineering Mechanics" and is working on Integrated Instruction, Learning and Assessment (IILA) software for "Education with Excellence," so that even an initially failing student can eventually earn an A grade, with correct answers to all questions in every quiz, test, or exam. At present, he is working on a Five Fold Plan for Enhancing Student Performance in Engineering Mechanics using Mathcad Interactive Tutorial Assessment.
Dr. Ramalingam Radhakrishnan, Prairie View A&M University
Dr. Ramalingam Radhakrishnan is a professor in the Department of Civil & Environmental Engineering at Prairie View
: (1) water quality analysis; (2) lake front development and remediation; (3) development of MOOCs; (4) accreditation, academic quality framework, and academic auditing; (5) learning spaces and blended approaches; (6) active and experiential learning; (7) sustainable development and education; (8) urban environment management and smart cities; (9) solid and hazardous waste management and landfill engineering; and (10) life cycle assessment and sustainable construction materials. His research and training programme is funded by the ITEC, DST, World Bank, MEA, MoE, PWD, and several prominent state governments and industries. Dr. Jana has published around 50 research articles in international and national journals and conferences
Paper ID #36903
A Feasibility Study of Spatial Cognition Assessment in Virtual Reality for Computer Aided Design Students
Dr. Ulan Dakeev, Sam Houston State University
Dr. Ulan Dakeev is an Associate Professor in the Engineering Technology Department at Sam Houston State University. His areas of research include Virtual & Augmented Reality, renewable energy (wind energy), quality in higher education, motivation, and engagement of students.
Dr. Reg Recayi Pecen, Sam Houston State University
Dr. Reg Pecen is currently a Quanta Endowed Professor of the Department of Engineering Technology at Sam Houston State University in
Paper ID #37385
A Rubric-Based Assessment of Information Literacy in Graduate Course Term Papers
Dr. Bridget M. Smyser, Northeastern University
Dr. Smyser is a Teaching Professor in the Mechanical and Industrial Engineering department at Northeastern University.
Jodi Bolognese, Northeastern University
Jodi Bolognese is the Engineering Librarian at Northeastern University, where she serves as liaison to the College of Engineering. Previously, she worked in product management for STEM learning technologies.
© American Society for Engineering Education, 2023

A Rubric-Based Assessment of
Paper ID #38437
Assessment of a Hybrid Research Experience for Undergraduates Program During the COVID-19 Pandemic
Jeremy Straub (Dr.)
© American Society for Engineering Education, 2022

Assessment of a Hybrid Research Experience for Undergraduates Program During the COVID-19 Pandemic

Abstract
This paper reports on the fourth year of a cybersecurity-focused research experience for undergraduates program site in the summer of 2021. Due to the COVID-19 pandemic, the site operated in a hybrid mode during this summer, after operating entirely virtually during
Session F2A4
Research with an Undergraduate Student: Using Entropy to Assess the Training of a Neural Network
G. Beate Zimmer, Jeremy S. Flores and Alexey L. Sadovski
Department of Computing and Mathematical Sciences, Texas A&M University – Corpus Christi
Philippe E. Tissot
Department of Physical and Life Sciences, Texas A&M University – Corpus Christi

Abstract
This paper reports on enhancing undergraduate
Paper ID #40135
Board 295: Five Year Assessment for Educating Diverse Undergraduate Communities with Affordable Transport Equipment
Zeynep Ezgi Durak, Washington State University
Zeynep Durak is a graduate research assistant at Washington State University. She is working on the design and development of low-cost miniaturized hands-on learning tools to demonstrate heat transfer and fluid mechanics concepts. Specifically, she is working on the development of a fluidized bed desktop learning module and its associated learning materials.
Prof. Bernard J. Van Wie, Washington State University
Prof. Bernard J. Van Wie received
Graduate Academy for Teaching Excellence Fellow, a Global Perspectives Fellow, a Diversity Scholar, a Fulbright Scholar, a recipient of the NSF CAREER award, and was inducted into the Bouchet Honor Society. Homero serves as the American Society for Engineering Education (ASEE) Chair for the Commission on Diversity, Equity, and Inclusion (CDEI), the Program Chair for the ASEE Faculty Development Division, and the Vice Chair for the Research in Engineering Education Network (REEN).
Matthew A. Witenstein, University of Dayton
Jeanne Holcomb, University of Dayton
© American Society for Engineering Education, 2023

Assessing Global Engagement Interventions to Advance Global Engineering
worked in consulting in the private sector and as an analyst in the U.S. Government. He has earned master's degrees in business administration and international affairs.
© American Society for Engineering Education, 2023

Assessing levels of psychological safety and teamwork satisfaction in engineering senior capstone teams

Abstract
Developing a team into a learning organization has been shown to create high-performing teams. Amy Edmondson's work showed that forming a learning organization requires a psychologically safe environment. The existing research comes from studies of industry and professional organizations, but there is little work showing whether teams of university students are
She is a member of the Human Factors and Ergonomics Society (HFES). Prior to her academic career, she spent more than 10 years advising Fortune 500 clients on the design of customer interfaces at Deloitte Consulting and Morgan Stanley & Company.
Sabrina J. Bierstetel, Franciscan University of Steubenville
© American Society for Engineering Education, 2023

Assessing Resilience as a Virtue in Learners: Development of a New Instrument for Academic Resilience

Abstract:
Resilience is a learner disposition that serves as an aspect of the virtue of fortitude. While many measures exist that examine resilience, few do so in an educational context. Existing scales of academic resilience