Six Years And Thousands Of Assignments Later: What Have They Learned, And What Have We Learned?

Conference

2007 Annual Conference & Exposition

Location

Honolulu, Hawaii

Publication Date

June 24, 2007

Start Date

June 24, 2007

End Date

June 27, 2007

ISSN

2153-5965

Conference Session

ECE Pedagogy and Assessment

Tagged Division

Electrical and Computer

Page Count

22

Page Numbers

12.1281.1 - 12.1281.22

DOI

10.18260/1-2--1599

Permanent URL

https://peer.asee.org/1599

Paper Authors

J. Shawn Addington, Virginia Military Institute

J. Shawn Addington is the Jamison-Payne Institute Professor and Head of the Electrical and Computer Engineering Department at the Virginia Military Institute. He received his B.S., M.S., and Ph.D. degrees in Electrical Engineering from Virginia Polytechnic Institute and State University. He teaches courses, laboratories, and undergraduate research projects in the microelectronics and semiconductor fabrication areas, and he remains active in curriculum development and engineering assessment. He is a registered professional engineer in the Commonwealth of Virginia and a member of ASEE, IEEE, and several other engineering professional and honor societies.


Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Six Years and Thousands of Assignments Later: What Have They Learned, and What Have We Learned?

Abstract

Following the birth of Engineering Criteria 2000, many programs have had the opportunity to fully develop, evaluate, and revise their assessment schemes. Most importantly, programs have now had ample opportunity to use feedback from these schemes to effect improvements within their programs. The purpose of this paper is to illustrate both the formative and summative phases of assessment that have been, and continue to be, used in the Electrical and Computer Engineering Department at this institution. The paper begins with an overview of the assessment system utilized by this program, including program-level and course-level assessments and the feedback loops associated with each. Among the many assessment tools the program employs is student performance data, which enables (1) real-time formative feedback to the instructor regarding student achievement at the course level, and (2) summative evaluation of outcomes achievement at the program level, in both short-term and long-term studies. While significant consideration is given to the assessment processes themselves, this paper focuses on the important, overarching issue of how the data from these processes have been used to effect program changes, evaluate the effectiveness of previous program changes, validate program direction and philosophy, and influence future planning at both the program and course levels.
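The full paper details these mechanisms; purely as an illustration of the idea, the minimal Python sketch below shows one way outcome-tagged assignment scores could feed both views. All assignment names, outcome tags, scores, and the threshold here are hypothetical and are not taken from the paper.

# Hypothetical sketch (not from the paper): aggregating outcome-tagged
# assignment scores into formative and summative views.
from collections import defaultdict

# Each record: (assignment, outcome tag, points earned, points possible).
# Tags, assignments, scores, and the 0.70 threshold are all illustrative.
grades = [
    ("HW1",   "a", 18, 20),
    ("HW1",   "e",  7, 10),
    ("Lab2",  "b", 25, 30),
    ("Exam1", "a", 62, 100),
]

THRESHOLD = 0.70  # illustrative minimum acceptable achievement level

def outcome_achievement(records):
    # Summative view: fraction of available points earned per outcome.
    earned, possible = defaultdict(float), defaultdict(float)
    for _, outcome, score, points in records:
        earned[outcome] += score
        possible[outcome] += points
    return {o: earned[o] / possible[o] for o in possible}

def flag_for_instructor(records):
    # Formative view: items falling below the threshold, flagged while
    # the course is still running so the instructor can intervene.
    return [(a, o, s / p) for a, o, s, p in records if s / p < THRESHOLD]

print(outcome_achievement(grades))   # e.g. {'a': 0.667, 'e': 0.7, 'b': 0.833}
print(flag_for_instructor(grades))   # e.g. [('Exam1', 'a', 0.62)]

A program-level loop of the kind the abstract describes would then aggregate such course-level results across offerings and years; this sketch deliberately stops at a single course.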

In recent years, a significant number of publications have reported on the assessment strategies employed by various institutions; however, few of these strategies appear to have matured to the point of providing details on how the gathered data are used, especially over the long term. This paper attempts to address this apparent gap in the literature and, hopefully, to initiate further discussion and refinement on the topic.

Introduction

“Assessment isn’t a once-and-done project… It is, instead, a continuous cycle of raising questions and finding some answers that raise more questions.”1

With the advent of EC 2000 came a dramatic shift from the oft-referenced “bean-counting” approach to a primarily “outcomes-based” approach to engineering assessment. For several years thereafter, institutions struggled with this transition and the inherent need to develop radically different assessment processes. Common questions such as “How do I measure that?”, “What is the difference between an outcome and an objective?”, and “What, and how much, data do I need?” dominated the attention of programs around the country during their planning stages.2,3,4,5,6,7,8 As a result, important questions like “How can I use these data to improve my course and my program?” were often deferred until more data were available and/or assessment plans had matured sufficiently. Until then, programs could only offer limited, primarily qualitative, evidence of the use of assessment data in effecting program

Addington, J. S. (2007, June), Six Years And Thousands Of Assignments Later: What Have They Learned, And What Have We Learned? Paper presented at 2007 Annual Conference & Exposition, Honolulu, Hawaii. 10.18260/1-2--1599

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2007 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.