Annual Documentation of Assessment and Evaluation of Student Outcomes Simplifies Self-Study Preparation

Conference

2016 ASEE Annual Conference & Exposition

Location

New Orleans, Louisiana

Publication Date

June 26, 2016

Start Date

June 26, 2016

End Date

June 29, 2016

ISBN

978-0-692-68565-5

ISSN

2153-5965

Conference Session

Assessment & Accreditation in ECE

Tagged Division

Electrical and Computer

Page Count

12

DOI

10.18260/p.26252

Permanent URL

https://peer.asee.org/26252

Download Count

586

Paper Authors

Zia A. Yamayee, University of Portland

Dr. Yamayee's current professional interests include outcomes assessment in engineering education; design in engineering education; engineering design methodologies; and application of design methods to electric power distribution, transmission, and generation. Dr. Yamayee's work to date has included projects in power system planning, maintenance scheduling, hydrothermal simulations, unit commitment, operational and financial impacts of integrating new technologies with power systems, probabilistic production simulations, and integrated resource planning. In recent years, he has authored a number of articles and has given numerous presentations on outcomes-based engineering curriculum development and the implementation of the ABET Criteria for Accrediting Engineering Programs.

His professional experience includes more than 33 years of university administration, teaching, consulting, and research, as well as five years of full-time work in industry.

Peter M. Osterberg, University of Portland

Dr. Peter Osterberg is an associate professor in Electrical Engineering at the University of Portland (Portland, OR). He received his BSEE and MSEE degrees from MIT in 1980. He received his Ph.D. degree in electrical engineering from MIT in 1995 in the field of MEMS. He worked in industry at Texas Instruments, GTE, and Digital Equipment Corporation in the field of microelectronics. His research interests are microelectronics, MEMS, and nanoelectronics.

Abstract

Electrical Engineering (EE) programs seeking accreditation from the EAC of ABET must demonstrate that they satisfy eight general accreditation criteria, plus any program-specific criteria. Two of the most challenging and debated criteria are Criterion 3, Student Outcomes (SOs), and Criterion 4, Continuous Improvement. To prepare our EE program for a successful accreditation review, we divided the six-year ABET accreditation cycle into three distinct phases: the years before the Self-Study year (phase one), the Self-Study year (phase two), and the visit year (phase three).

During phase one of the accreditation cycle (2010-2014), a number of direct and indirect assessment methods were used to assess and evaluate Student Outcomes, and the results were used to identify program improvements. The program faculty documented the results in annual assessment and evaluation reports. During the Self-Study year (2014-2015), we used these annual reports to prepare the Self-Study report. The annual reports also provide evidence that improvements to our EE program were based on assessment and evaluation of SOs as well as other inputs.

At the heart of our assessment program lies course-embedded assessment. The choice of courses for course-embedded assessment is guided by two principles: (1) each Student Outcome is assessed with student work in a benchmark course, and (2) only required courses, not elective courses, in the curriculum are selected as benchmark courses.

Assessment of a benchmark course is conducted with the following in mind: (1) assessment of student work measures the extent to which SOs are being attained, (2) it is not necessary to use all of the student work to assess an outcome, and (3) outcomes assessment is based upon student work and is guided by the grading of that work.

EGR 360, Analysis of Engineering Data, was selected as a benchmark course for EAC Student Outcome b (an ability to design and conduct experiments, as well as to analyze and interpret data). To determine the degree to which Student Outcome b is attained, the following Performance Indicators were used:

Performance Indicator b.1. Analyze data to determine specified quantities. Exam problems asked students to determine the mean and standard deviation of a random sample and to apply the Central Limit Theorem to calculate probabilities.
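
As an illustrative sketch only (standard textbook formulas given here to indicate the kind of calculation such exam problems involve; the paper's actual problems are not reproduced): for a random sample $x_1, \dots, x_n$, the sample mean and sample standard deviation are
\[
\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad
s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2},
\]
and the Central Limit Theorem yields approximate probabilities for the sample mean, e.g.
\[
P\!\left(\bar{X} > c\right) \approx 1 - \Phi\!\left(\frac{c - \mu}{\sigma/\sqrt{n}}\right),
\]
where $\Phi$ is the standard normal cumulative distribution function.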

Performance Indicator b.2. Interpret the results for correctness and precision or apply the results to a pre-assigned problem. Students were asked to specify the value of a test statistic and draw a conclusion based on a statistical hypothesis test.
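
As a hedged illustration (a standard one-sample t-test; the paper does not specify which hypothesis test was assigned, so this is assumed only for exposition): to test $H_0\!: \mu = \mu_0$ against $H_1\!: \mu \neq \mu_0$ with a sample of size $n$, the test statistic is
\[
t = \frac{\bar{x} - \mu_0}{s/\sqrt{n}},
\]
and $H_0$ is rejected at significance level $\alpha$ when $|t| > t_{\alpha/2,\,n-1}$; otherwise the data do not provide sufficient evidence against $H_0$.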

Performance Indicator b.3. Understand and apply concepts of randomization in experimental design. Students were asked to identify factors that would introduce variability in replicating an experiment, such as the manufacture of a given product.

This paper provides a detailed description of the process, the data collection effort, and the analysis of the results from applying the course-embedded assessment method to EGR 360, Analysis of Engineering Data.

Yamayee, Z. A., & Osterberg, P. M. (2016, June). Annual Documentation of Assessment and Evaluation of Student Outcomes Simplifies Self-Study Preparation. Paper presented at the 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. https://doi.org/10.18260/p.26252
