Iterative Driven Competency-Based Assessment in a First-Year Engineering Computation Module

Conference

2025 ASEE Annual Conference & Exposition

Location

Montreal, Quebec, Canada

Publication Date

June 22, 2025

Start Date

June 22, 2025

End Date

August 15, 2025

Conference Session

First-Year Programs Division (FPD) Technical Session 11: Shaping Engineers - Competency, Creativity, and Iteration in the First Year

Tagged Division

First-Year Programs Division (FPD)

Page Count

16

DOI

10.18260/1-2--56902

Permanent URL

https://peer.asee.org/56902

Paper Authors

James Bittner, Michigan Technological University (ORCID: orcid.org/0000-0002-0916-1658)

James Bittner is an Assistant Teaching Professor in the Engineering Fundamentals Department at Michigan Technological University. His recent courses focus on foundational engineering subjects, including statics, design practices, and computational problem-solving, and he emphasizes active learning methodologies in his classroom. He has research experience in exploratory active learning practices, nondestructive testing of civil infrastructure materials, and nonlinear wave theory. Prior to his academic career, he worked as an engineer in the maritime construction industry, specializing in hydraulic sediment transport and geotechnical analysis.

Matt Barron, Michigan Technological University

Dr. Barron's teaching interests include solid mechanics, engineering fundamentals, and transitional mathematics. His research interests include educational methods, non-cognitive factors, and bone tissue engineering. Prior to MTU, Dr. Barron worked at Bay de Noc Community College for eleven years, and he also has several years of experience working in Research and Development at Kimberly-Clark Corporation.

AJ Hamlin, Michigan Technological University

AJ Hamlin is a Principal Lecturer in the Department of Engineering Fundamentals at Michigan Technological University, where she teaches first-year engineering courses. Her research interests include engineering ethics, spatial visualization, and education.

Abstract

This Complete Evidence-Based Practice paper explores one tool for supporting competency-based assessment in a first-year engineering course. Competency-based assessment in a first-year engineering computation module offers a pathway to improve student engagement and enhance learning outcomes. Shifting the focus from traditional one-try assessment to a more dynamic evaluation of core computational skills, such as algorithmic loops, plotting, and functions, can enable deeper, personalized learning experiences. The primary challenge is creating a more responsive, interactive relationship with every student, regardless of their previous content knowledge. Autograding systems can play a pivotal role in this relationship by providing instant, real-time feedback on students' efforts. One approach is to allow autograding to occur during the assessment, as an iterative process. To be effective, these systems must be designed not only to evaluate correctness but also to analyze visual outputs like graphs and to assess the intermediate steps of computation. This immediate, iterative feedback loop follows the techniques content experts often deploy to solve challenging problems, guiding students to identify and correct mistakes as they learn and fostering deeper engagement with the material. By integrating real-time, feedback-driven evaluations, educators can create a more engaging learning environment that promotes essential computational reasoning skills. However, crafting automated feedback is time-intensive and costly, especially on first implementation. Collaborating on problem sets and documenting observations and improvements can help reduce these obstacles to broad implementation. In this paper we document the process used to transition a first-year engineering class's MATLAB assessment into an autograded environment.
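The paper's assessment code runs in MATLAB Grader and is not reproduced on this page. As an illustrative sketch only, assuming a matplotlib analogue of the figure-component checks described above (title, axis labels, plotted data), an autograder-style check might look like:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend; no display required for grading
import matplotlib.pyplot as plt

def figure_issues(ax):
    """Return a list of missing components on a plot axis, for feedback to the student."""
    issues = []
    if not ax.get_title():
        issues.append("missing title")
    if not ax.get_xlabel():
        issues.append("missing x-axis label")
    if not ax.get_ylabel():
        issues.append("missing y-axis label")
    if not (ax.lines or ax.collections):
        issues.append("no data plotted")
    return issues

# A complete "student" plot passes every check
fig, ax = plt.subplots()
ax.plot([0, 1, 2], [0, 1, 4])
ax.set_title("Projectile Height")
ax.set_xlabel("Time (s)")
ax.set_ylabel("Height (m)")
print(figure_issues(ax))  # []
```

Returning a list of specific issues, rather than a single pass/fail, is what enables the iterative feedback loop: the student sees which figure component is missing and resubmits.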
We will demonstrate techniques to evaluate the components of a proper figure, and ways to randomize a problem, in the commercial MathWorks Grader environment. We will compare student performance on the assessment, students' perceptions of the experience, and the uniqueness of submissions. Student performance will be compared with a prior year's standard assessment results, and student perception will be compared with a common end-of-course survey. Uniqueness of submissions will be evaluated with a tool that identifies the percentage of similar lines of code. Because running an autograded environment exposes educators to every early submission, a metric identifying which assessment objectives are the most challenging is collected as well. Through implementation of autograded assignments, our courses have seen a decrease in the time it takes students to engage with a challenging problem and ask questions. One core issue identified in deployment is the challenge of creating multiple problem sets or banks and the difficulty of writing broad validation code. The anticipated survey and performance results will discuss observed student performance, student perception, and the amount of non-unique submissions. This approach supports individual learning needs and better prepares students for future computational engineering challenges by making assessment a more dynamic and impactful part of their educational experience.
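The specific similarity tool used in the study is not named on this page. As a minimal sketch of the idea, assuming a line-level comparison of normalized source lines (a hypothetical stand-in for the authors' tool), a similarity percentage can be computed with Python's standard-library difflib:

```python
import difflib

def similar_line_percent(code_a: str, code_b: str) -> float:
    """Percentage of matching non-blank, whitespace-normalized lines between two submissions."""
    def norm(src):
        return [ln.strip() for ln in src.splitlines() if ln.strip()]
    a, b = norm(code_a), norm(code_b)
    if not a and not b:
        return 100.0  # two empty submissions are trivially identical
    return 100.0 * difflib.SequenceMatcher(None, a, b).ratio()

# Two hypothetical MATLAB submissions sharing one identical line out of three
student_1 = "x = 1:10\ny = x.^2\nplot(x, y)"
student_2 = "x = 1:10\nz = x.^3\nplot(x, z)"
print(round(similar_line_percent(student_1, student_2), 1))
```

Comparing whole lines rather than characters keeps the metric aligned with the paper's framing of "similar lines of code", and stripping whitespace avoids flagging purely cosmetic differences.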

Bittner, J., Barron, M., & Hamlin, A. (2025, June). Iterative Driven Competency-Based Assessment in a First-Year Engineering Computation Module. Paper presented at the 2025 ASEE Annual Conference & Exposition, Montreal, Quebec, Canada. 10.18260/1-2--56902

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2025 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015