Assessing the Impact of New Teaching Methods by Predicting Student Performance

Abstract
Many teachers try new things in the classroom with the intent of making learning more effective. In most cases, assessment of the impact is anecdotal; the teacher surveys the students about the new technique and draws conclusions based on their feedback. To prove the impact more definitively, better assessment tools are needed. In a recent study, we attempted to predict performance in a course and then measure the improvement due to a major change in the available resources for study outside the classroom in our fundamentals of engineering course. To measure the effectiveness, we used the GPA of the students at the start of the semester
2025 ASEE Northeast Section Conference, March 22, 2025, University of Bridgeport, Bridgeport, CT, USA.

Assessing the Impact of Data Augmentation on Underwater Image Super Resolution

Nusrat Zahan
School of Engineering and Computing
Fairfield University
Fairfield, CT
nusrat.zahan@student.fairfield.edu

Sidike Paheding
School of Engineering and Computing
Fairfield University
Fairfield, CT
Missouri Instructor Survey Assessment of Project Lead The Way Programs

Stuart W. Baur, Ph.D., A.I.A. and R. Joe Stanley, Ph.D.

Abstract
There have been several studies that show the benefits of Project Lead The Way (PLTW) courses for K-12 students in preparing high school students for statewide and national exams, high school academic performance, college-level academic performance in particular areas of study, high school student engagement, and other areas. The challenge is how schools are being prepared to attract students to such programs and whether the students are excited about the curriculum. This study examines the impact of PLTW courses at the middle through high school level. Survey results from 208 instructors who
Promoting Critical Thinking during Problem Solving: Assessing Solution Credibility

Charles E. Baukal
University of Tulsa and Oral Roberts University

Abstract
Engineering students are considered novices while their instructors are experts in a given field. One of the goals of engineering education is to move students closer to being experts. Engineers are problem solvers by nature, and an important skill to be learned is the ability to assess the credibility of solutions. Engineering educators can help students improve this ability by modeling solution assessment in the classroom by predicting, where possible, what the solution should look like before
Assessment of Team Projects in an Electrical Power Systems Course

Bruno Osorno
California State University, Northridge

Abstract
With team-project-based courses, the challenge has been assessment. Various attempts have been effective in one dimension and weak in another. Utilizing a different method, a comprehensive assessment of a team-project course is discussed in this paper. In this 15-week course, two midterms, a final exam, and weekly 15-minute quizzes and homework assignments were administered in addition to five team projects. The relationship between
Excellence in Teaching, and 2005 Mechanical Engineering Instructor of the Year award, 1999 ASEE-GSW Outstanding New Faculty Award. His teaching and research interests are in the thermal sciences. In 2015-2016, he chaired the American Society for Engineering Education Gulf Southwest section, and in 2018-2019 he chaired the Academy of Distinguished Teaching Scholars at UTSA. He is a registered Professional Engineer in Texas.

©American Society for Engineering Education, 2025

Session 9

Assessing the Impact of Artificial Intelligence on Undergraduate Mechanical
Development of Web-service Exam to Improve Integrity of Remote Assessment

Douglas E. Dow
School of Engineering, Wentworth Institute of Technology
Boston, Massachusetts, USA
dowd@wit.edu

Abstract—COVID-19 and remote learning challenged the integrity of exams. At-home, unproctored, and web-based exams resulted in increased reports of students engaging in exam-taking tactics outside

I. INTRODUCTION
Authentic learning is central to the value of education and a
learning, alternative grading, and design thinking, he also co-founded the STEPS program (funded through NSF S-STEM) to support low-income, high-achieving engineering students. Budischak holds a Doctorate in Electrical Engineering and enjoys outdoor activities with his family.

FYEE 2025 Conference: University of Maryland - College Park, Maryland, Jul 27

Work In Progress: Enhancing Student Collaboration Through Growth-Based Assessment Practices

Introduction

Background
In a broad literature review, Geisinger and Raman summarized many factors related to student attrition from engineering majors [1]. The authors noted that competitive grading environments commonly found in STEM disciplines have been linked with
Paper ID #37779

Assessing Entrepreneurial Mindsets – A Work-In-Progress paper exploring how to create and deploy quantitative and qualitative assessments for student entrepreneurial mindset development

Aubrey Wigner (Assistant Professor)
Dr. Aubrey Wigner was an Assistant Professor at MSU Broad Business College, where he taught and developed courses for the Minor in Entrepreneurship and Innovation. Starting in the Fall of 2022 he will move to Colorado School of Mines to join the Engineering, Design, & Society team in teaching capstone, cornerstone, and design. He emphasizes deep engagement and hands-on practices in
Paper ID #38323

Assessing Socially Engaged Engineering Training on Students' Problem Solving: The Development of a Scenario-based Assessment Approach

Elizabeth Rose Pollack (PhD Student)
Erika Mosyjowski
Erika A. Mosyjowski (she/her/hers) works for the Center of Socially Engaged Design at the University of Michigan as the Research and Faculty Engagement Manager. She has a B.A. in Psychology and Sociology from Case Western Reserve University and an M.A. and Ph.D. in Higher Education from the University of Michigan. Her research interests include engineering culture, fostering engineers' sociocultural and contextual
Paper ID #18427

Assessing Students' Global and Contextual Competencies: Three Categories of Methods used to Assess a Program with Coursework and International Modules

Dr. David B. Knight, Virginia Polytechnic Institute and State University
David Knight is an Assistant Professor and Director of International Engagement in the Department of Engineering Education and affiliate faculty with the Higher Education Program, Center for Human-Computer Interaction, and Human-Centered Design Program. His research focuses on student learning outcomes in undergraduate engineering, learning analytics approaches to improve educational practices
AC 2009-2513: A BLOOM'S ONLINE ASSESSMENT TEST (BOAT) TO ASSESS STUDENT LEARNING OUTCOME IN A DISTANCE ENGINEERING EDUCATION COURSE

Prakash Ranganathan, University of North Dakota
Richard Schultz, University of North Dakota

© American Society for Engineering Education, 2009

Using DAQ boards to communicate with NXT in measurements and Instrumentation applications

Prakash Ranganathan, Richard Schultz
Department of Electrical Engineering
University of North Dakota
Grand Forks, North Dakota 58202
email: prakash.ranganathan@mail.und.edu
AC 2009-428: ASSESSING CREATIVITY IN ARCHITECTURAL DESIGN: EVIDENCE FOR USING STUDENT PEER REVIEW IN THE STUDIO AS A LEARNING AND ASSESSMENT TOOL

Joseph Betz, State University of New York
Joseph A. Betz is an architect and Professor in the Department of Architecture & Construction Management at the State University of New York College of Technology at Farmingdale. He received his undergraduate and professional degrees in architecture from the Rensselaer Polytechnic Institute and his post-professional degree in architecture from Columbia University. A recipient of the SUNY Chancellor's Award for Excellence in Teaching, he has served as both national Program Chair and Division Chair of the
Paper ID #28410

CoOrdinated Math-Physics Assessment for Student Success (COMPASS) assessments on continuing math courses and attitude toward math

Dr. Guangming Yao, Clarkson University
Guangming Yao (Associate Professor and Executive Officer, Clarkson University): Professor Yao received her BSc and MS in Mathematics and Applied Mathematics from Harbin Normal University and PhD in Computational Mathematics from the University of Southern Mississippi. She joined the Department of Mathematics and Computer Science at Clarkson University in 2012. Her research focuses on computational and applied PDEs and radial basis functions
AC 2008-316: CONNECTING THE DOTS IN ASSESSMENT: FROM COURSE STUDENT LEARNING OBJECTIVES TO EDUCATIONAL PROGRAM OUTCOMES TO ABET ASSESSMENT

Esteban Rodriguez-Marek, Eastern Washington University
ESTEBAN RODRIGUEZ-MAREK did his graduate work in Electrical Engineering at Washington State University. He worked as a research scientist at Fast Search & Transfer before transferring to the Department of Engineering & Design at Eastern Washington University. His interests include image and video processing, communication systems, digital signal processing, and cryptographic theory and applications.

Min-Sung Koh, Eastern Washington University
MIN-SUNG KOH obtained his B.E. and M.S. in Control and
Session 15470

Implementing classroom outcomes assessment (TAC) with commercially available software:
A Computerized Approach to Outcomes Assessment – A Pilot Study

Bill Drake, Ph.D., Southwest Missouri State University
Douglas Walcerz, Ph.D., Enable Technologies, Inc.

Abstract
The Industrial Management program at Southwest Missouri State University (SMSU) has begun the process of designing and implementing an outcomes assessment process for the continuous improvement of the programs
Paper ID #43090

Board 242: Developing Valid and Equitable Tasks for Assessing Programming Proficiency: Linking Process Data to Assessment Characteristics

Dr. Mo Zhang, Educational Testing Service
Amy Jensen Ko, University of Washington
CHEN Li, Educational Testing Service

©American Society for Engineering Education, 2024

Developing Valid and Equitable Tasks for Assessing Programming Proficiency: Linking Process Data to Assessment Characteristics

Mo Zhang (mzhang@ets.org), Amy Ko (ajko@uw.edu)
Life Cycle Assessment and Sustainability Analysis of Lignin Derivative Products, Materials and Technologies. Integrated Process Modeling, Scientific Framework and LCA for Assessing Sustainability

Arash Jahandideh
Agricultural Engineering Department
South Dakota State University
Brookings, SD 57007
Arash.Jahandideh@sdstate.edu

Samaeh Aminikhangahi
Computer Science Department
Washington State University
Pullman, WA 99163
Indicators (KPIs). Key Performance Indicators (KPIs) are a measurement tool that quantifies progress toward an intended strategic business outcome. KPIs allow for methodical operational improvement, create an analytical basis for decision making, and identify significant attributes. Managing a business with KPIs includes setting the desired level of performance and tracking progress against those levels.

Our project focused on optimizing custom circuitry assembly, which will benefit our industry partner's efforts in improving quote estimates and profitability. Profitability ratios are a category of financial measurement tools that are used to assess a business's ability to generate earnings relative to its revenue, operating
test usage in engineering courses. Tests and exams are typically heavily used in FECs like statics, dynamics, thermodynamics, and other courses in various engineering disciplines. Understanding why engineering instructors rely heavily on tests to assess student learning in these courses can be crucial in promoting the use of more diverse types of assessments, such as portfolios, concept inventories, reflection-based practices, and project-based practices, as well as intentionality in designing, administering, and interpreting tests; however, research documenting this topic has been scarce.

Conversations around why instructors make certain course decisions typically involve the contexts these instructors are situated in, emphasizing how
Paper ID #37745

Exploring the Alignment of Instructor's Intent and Students' Perception of Using Self-Assessment in an Engineering Undergraduate Course

Mr. Viyon Dansu, Florida International University
Viyon holds Bachelor's and Master's degrees in Systems Engineering. Thereafter he co-founded STEM-Ed Africa, a social enterprise involved in developing products and services geared at teachers' development and improving high school students' problem-solving abilities in STEM subject areas. He is currently a doctoral student of engineering and computing education at Florida International University, Miami.

Mr. Yashin Brijmohan
Paper ID #37898

Assessment of Changes in Confidence and Judgements of Problem-Solving Processes in Senior Level Chemical Engineering Students

Sheima Khatib
Jessica C Pittman
Jessica is a 4th-year graduate student at Texas Tech University in the Cognitive Experimental Psychology doctoral program. Her interests broadly revolve around student self-regulation in the context of higher education.

Roman Taraban (Professor)
Professor in Psychological Sciences

© American Society for Engineering Education, 2022

Assessment of Changes in
regularly mentors faculty and facilitates workshops on instructional design, Quality Matters assessments, and novel edtech applications. She is also the acting liaison for the Office of Institutional Assessment and Accreditation, and creates online assessment resources and facilitates webinars and workshops for all levels of administration and faculty to demonstrate how to leverage assessment data in service to continuous programmatic improvement and resource acquisition. Her research interests include STEM communications pedagogy, cognitive empathy, industry-academia interaction, teaching and learning.

James Righter
Robert J. Rabb (Chair, Mechanical Engineering)
Professor, Mechanical Engineering, The Citadel
The two groups compared were students who scored 50% or below on the pre-test, indicating lower prior knowledge, and those who scored above 50% on the pre-test, indicating higher prior knowledge. The JMP program was used to analyze the data using t-test connecting-letters reports. Results from these tests were compared to examine differences in score increases for those with lower vs. higher prior knowledge for each of the three LCDLMs and then for the collective scores of all three together. It should be noted that the mode of implementation (virtual, hybrid, or in person) was not separately considered in this analysis.

3.0 Results and Discussion
Results from average assessment scores show a significantly greater impact on understanding of concepts
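The group comparison described above can be sketched as an independent-samples t-test on score gains. This is a minimal illustration, not the study's analysis (which was run in JMP with connecting-letters reports), and the gain values below are hypothetical placeholders rather than the study's data.

```python
# Hypothetical sketch of the described analysis: comparing score increases for
# lower-prior-knowledge (pre-test <= 50%) vs higher-prior-knowledge (> 50%)
# students. All numbers are illustrative, not data from the study.
from scipy import stats

# Illustrative pre-to-post score gains (percentage points) for each group.
low_prior_gains = [22, 30, 18, 25, 27, 31, 20, 24]
high_prior_gains = [10, 14, 8, 12, 9, 15, 11, 13]

# Welch's t-test (no equal-variance assumption) on the two gain distributions.
t_stat, p_value = stats.ttest_ind(low_prior_gains, high_prior_gains,
                                  equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A connecting-letters report extends this pairwise idea to all group pairs, assigning shared letters to groups whose means are not significantly different.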
classes, assessment measures of both students and faculty, and the effects on student learning of increased reliance on teaching faculty without tenure.

© American Society for Engineering Education, 2022

The benefits of writing machine-graded final exams to be capable of more nuanced feedback in large foundational mechanics courses

Abstract
We discuss an approach to multiple-choice exams that awards partial credit to students who make minor common mistakes when calculating their numerical solutions, in order to promote more nuanced feedback and grading. Assessing student performance in large
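The partial-credit idea in this abstract can be sketched as a grader that recognizes answer values produced by known common mistakes. The function name, tolerance, answer values, and credit weights below are hypothetical illustrations, not the authors' rubric.

```python
import math

def grade_numeric(student_answer, correct, common_mistakes, tol=0.01):
    """Award full credit for the correct value, partial credit when the answer
    matches a known common-mistake value (e.g. a sign error or a dropped unit
    conversion), and zero otherwise. common_mistakes is a list of
    (mistake_value, credit) pairs; all values here are hypothetical."""
    if math.isclose(student_answer, correct, rel_tol=tol):
        return 1.0
    for mistake_value, credit in common_mistakes:
        if math.isclose(student_answer, mistake_value, rel_tol=tol):
            return credit
    return 0.0

# Illustrative case: the correct reaction force is 9.81 kN; forgetting g gives
# 1.0, and a sign error gives -9.81. Both are "minor common mistakes".
mistakes = [(1.0, 0.5), (-9.81, 0.5)]
print(grade_numeric(9.81, 9.81, mistakes))   # full credit
print(grade_numeric(-9.81, 9.81, mistakes))  # partial credit
print(grade_numeric(3.3, 9.81, mistakes))    # no credit
```

Because each distractor is tied to a specific mistake, the same lookup table can also drive targeted feedback messages rather than a bare score.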
Themes

Claire situated her rich picture in the physical classroom environment and primarily focused on her perceptions of the interaction and dynamics between the students and the instructor (Figure 4). In her rich picture, some students were confused and needed help, others understood, and one was distracted. In her verbal description of her image, Claire emphasized a divide between the students who "got it" and those who did not. This perspective reflects her efficacy beliefs, or beliefs about her and other students' abilities to meet the course expectations based on the assigned tasks and assessments [23]. Also, she included the course instructor standing in front of the classroom (facing the students) beside a board. On the board, the instructor provided
expressed in terms of specific stress components and not in terms of effective, e.g., von Mises, quantities. The results from the 2D FEA code may also be compared with results from a commercial FEA program such as ANSYS Workbench to help the students directly see the integration of FEA theory in practice.

A cursory assessment of the effectiveness of this process has been conducted via a Qualtrics survey distributed toward the end of the term. The survey intends to gather the students' subjective experience regarding MATLAB Grader and the development and use of the 2D FEA code. Survey results are separated from any student identifiers and are not tied to student performance in the course. The authors intend to pursue a more rigorous assessment regarding the
, and fall 2021 semesters at the department of electrical and computer engineering hosting this study. We used graded assignments and exams to assess the students' performance on flipped modules versus their performance on modules taught using traditional lecturing. We statistically compared students' performance in the flipped modules to those in the non-flipped modules. This quantitative comparison enabled us to evaluate flipped instruction's potential impact on student learning within each course offering. Also, we studied the students' perspectives on flipped learning and the activities conducted during class time through online surveys. Content analysis was used to qualitatively measure the students' perspectives on flipped learning.

The various