
Curricular Assessment Using Existing On Campus Information Databases



2007 Annual Conference & Exposition


Honolulu, Hawaii

Publication Date

June 24, 2007

Start Date

June 24, 2007

End Date

June 27, 2007



Conference Session

Innovations in Mechanical Engineering Education Poster Session

Tagged Division

Mechanical Engineering

Page Numbers

12.432.1 - 12.432.10




Paper Authors


Andrew Kean, California Polytechnic State University


Andrew Kean is an Assistant Professor of Mechanical Engineering at California Polytechnic State University, San Luis Obispo. He received his Ph.D. from University of California, Berkeley in 2002 and his B.E. from The Cooper Union in 1997. His interests include energy conversion, climate change, air pollution, and sustainability.



Glen Thorncroft, California Polytechnic State University


Glen Thorncroft is an Associate Professor of Mechanical Engineering at California Polytechnic State University, San Luis Obispo. He received his Ph.D. from the University of Florida in 1997, with a research emphasis in Boiling Heat Transfer. His current activities focus on improvement of undergraduate laboratory education, including new experiments, instrumentation, and pedagogy in Fluid Mechanics and Thermal Sciences, as well as introducing Uncertainty Analysis into the undergraduate curriculum.



NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Curricular Assessment Using Existing On-Campus Information Databases


Assessment of engineering program success is critical for continual improvement. While this assessment can take many forms, this work outlines an underutilized method of indirect assessment that takes advantage of existing campus-wide information databases. Most university campuses have some form of information database that contains student records, course records, and/or faculty records. The methodology of using these databases to assess program performance is motivated by the popular book “Freakonomics” by Levitt and Dubner (William Morrow, 2005). While somewhat limited in depth, the scope of questions that can be answered with the databases is limited only by the creativity of the analyst. Of particular interest to the authors are the trends in student grades for key courses (e.g., statics and thermodynamics) over time, as department personnel have changed significantly. We were also curious about connections between success in a prerequisite course and success in a follow-up course. This work outlines some of the obvious and not-so-obvious assessments that are possible, and identifies potential pitfalls the analyst should avoid.


It is the goal of most engineering education programs to accomplish continual improvement. To address this goal, assessment of the program’s success in achieving stated learning outcomes is necessary. For this reason, ABET Criterion 3 for the 2006-2007 Accreditation Cycle requires identification and assessment of program outcomes. Extensive efforts to improve assessment in education, and specifically engineering education, have already been undertaken (e.g., Astin, 1991; Shaeiwitz, 1996; Ewell, 1998; Pelligrino, Chudowsky, and Glaser, 2001; Olds, Moskal, and Miller, 2005). Generally, program assessment at the department level can be a time-consuming and expensive effort for the members of a department. Faster and easier methods for assessing program success and improvement would be welcomed by many engineering departments.

This manuscript describes an underutilized method of assessment based on existing campus-wide information databases. The indirect assessment methodology is motivated by the popular book “Freakonomics” by Steven Levitt and Stephen Dubner (William Morrow, 2005). That text presents a novel way of discerning how people behave in the real world: the authors analyze existing datasets to obtain unexpected answers to creative questions. Their analysis rests on two key concepts: 1) human behavior is strongly influenced by incentives, and 2) conventional wisdom is often wrong.

With these fundamental concepts in mind, we present a methodology for the specific application of assessing engineering programs. Ewell (1989 and 1998) has previously pointed out that capitalizing on existing data is a key approach for implementing assessment. The authors of the present work hope to provide a useful technique for better understanding the performance of our students and faculty.
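One of the analyses the abstract describes, relating success in a prerequisite course to success in its follow-up, can be sketched with a few lines of code once grade records are exported from a campus database. The sketch below is illustrative only: the grade data and course pairing (statics as a prerequisite to thermodynamics) are hypothetical, and any real analysis would draw on the institution's own records.

```python
# Illustrative sketch (not from the paper): correlating grades in a
# prerequisite course (statics) with grades in a follow-up course
# (thermodynamics), as might be pulled from a campus records database.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-student records: (statics grade points, thermo grade points)
records = [
    (4.0, 3.7), (3.3, 3.0), (2.7, 2.3), (2.0, 2.3),
    (3.7, 3.3), (3.0, 2.7), (2.3, 2.0), (4.0, 4.0),
]

statics = [s for s, _ in records]
thermo = [t for _, t in records]
print(f"r = {pearson_r(statics, thermo):.2f}")
```

A strong positive correlation would suggest the prerequisite is predictive of follow-up performance; a weak one might prompt a closer look at course alignment, though as the paper cautions, such indirect measures come with pitfalls the analyst must weigh.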

Kean, A., & Thorncroft, G. (2007, June), Curricular Assessment Using Existing On Campus Information Databases Paper presented at 2007 Annual Conference & Exposition, Honolulu, Hawaii. 10.18260/1-2--2938

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2007 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015