
Engineering Online Gateway System Ensuring And Evaluating Student Learning Through Automated, Milestone Exams

Conference

2010 Annual Conference & Exposition

Location

Louisville, Kentucky

Publication Date

June 20, 2010

Start Date

June 20, 2010

End Date

June 23, 2010

ISSN

2153-5965

Conference Session

Computer Education Innovations I

Tagged Division

Computers in Education

Page Count

10

Page Numbers

15.495.1 - 15.495.10

DOI

10.18260/1-2--16253

Permanent URL

https://peer.asee.org/16253

Download Count

308

Paper Authors

Marcial Lapp University of Michigan

Marcial Lapp is a graduate student in the Industrial and Operations Engineering Department at the University of Michigan. His research interests lie in modeling and solving large-scale optimization problems focused on the transportation and logistics industries, as well as improving undergraduate engineering education through innovative teaching technology. He holds Master's and Bachelor's degrees in Computer Science from the University of Michigan.

Jeffrey Ringenberg University of Michigan

Jeff Ringenberg is a lecturer at the University of Michigan's College of Engineering. His research interests include mobile learning software development, tactile programming, methods for bringing technology into the classroom, and studying the effects of social networking and collaboration on learning. He holds BSE, MSE, and PhD degrees in Computer Engineering from the University of Michigan.

T. Jeff Fleszar University of Michigan

Jeff Fleszar is a graduate student at the Michigan Ross School of Business. He has a research interest in improving engineering education through the incorporation of technology. He holds an MSI from the School of Information in Human Computer Interaction and a BS in Computer Science from the University of Michigan.

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Engineering Online Gateway System Ensuring and Evaluating Student Learning through Automated, Milestone Exams

Abstract

Many engineering courses use a sequential teaching strategy in which new material builds on previously presented concepts. While such a strategy lends itself to a natural progression of course concepts, students who lack a solid grasp of the initial material often fall behind and continue to struggle through the remainder of the course. To combat the problem of the "struggling student," we present a computer-based examination system that can be used at various times throughout the semester to verify that students have grasped the vital concepts of the course up to that point. The system can serve as a "gateway" through which all students must pass before taking a regular exam. Depending on the outcome of this gateway assessment, students may be required to seek help from the professor or a graduate student instructor before taking the regular exam; these help sessions focus primarily on the areas where the student's gateway results indicate improvement is needed. Through this computer-based examination system, which provides real-time C++ code compilation and testing, we seek to ensure adequate comprehension of the material presented in an introductory engineering/programming course. We have gathered statistically significant evidence of a strong correlation between a student's performance on our automated gateway system and that student's performance on the subsequent exam, indicating that gateway performance is indicative of overall course performance. We also present ideas for broader adoption of our gateway system throughout the engineering education community.
1. Introduction

As is common across many engineering schools, entering students are expected to complete a set of core courses in mathematics, science, physics, and computer programming. As previous researchers such as Werth [1] (1986) and Bergin [2] (2005) have noted, computer programming tends to be a difficult subject for many students, resulting in abnormally high attrition rates. Furthermore, subpar performance in a first-year course such as computer programming can often lead to student self-doubt and a subsequent departure from the engineering degree program. While many articles detail factors that may predict student performance in a computer science course, two common problems overshadow their effectiveness:

1) Predictors, such as previous programming knowledge and standardized performance indicators like the GRE or SAT, are not readily available when incoming students arrive.

2) Predictive factors require prior data collection to be effective and accurate at forecasting performance.

In this research study, instead of predicting a student's performance in a computer programming course, we focus on a new measurement and evaluation system that ensures continual student learning of course material throughout the semester and, more importantly, before any midterm exam.

Lapp, M., & Ringenberg, J., & Fleszar, T. J. (2010, June), Engineering Online Gateway System Ensuring And Evaluating Student Learning Through Automated, Milestone Exams. Paper presented at 2010 Annual Conference & Exposition, Louisville, Kentucky. 10.18260/1-2--16253

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2010 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.