June 20, 2010
June 23, 2010
Educational Research and Methods
15.1008.1 - 15.1008.48
Quality Indicators for Engineering & Technology Education
In recent years the development and use of university rankings, comparisons, and league tables has become popular, and several methodologies are now frequently used to provide comparative rankings of universities. These rankings are often based on research and publication activity, and not uncommonly focus on indicators that can be measured rather than those that should be measured. Further, the indicators are generally examined for the university as a whole rather than for university divisions, departments, or programs. Implicit, too, is the assumption that placement in the rankings is indicative of quality. This paper provides an overview of the methodologies used for the more popular rankings and summarizes their strengths and weaknesses. It examines critiques of rankings and league tables to provide appropriate context. The paper then addresses how a university (or a college or program) could be assessed in terms of the quality of its engineering and technology programs. It proposes a set of indicators that could provide relative measures of quality, not so much for individual engineering or technology programs as for the university itself.
Introduction & Methodology
Today's world, and by all indicators the world of the future, is increasingly competitive and demanding. Resource scarcity, a growing imperative for efficiency and effectiveness, manifestly more available information, and escalating expectations for quality are but some of the factors that have led universities, colleges, departments, and programs to attend to evaluation, accreditation and, invariably, rankings and comparisons [2, 3]. Furthermore, increased global and intra-national mobility, together with widespread access to information, has created the opportunity for individuals to research more carefully their selection of universities to attend.
Perhaps in response to such pressures, there has been an upsurge in the number of agencies, centers, corporations, and others concerned with rankings and comparisons (see Appendix A). The International Observatory on Academic Ranking and Excellence (IREG), the Institute for Higher Education Policy (IHEP), and the University of Illinois Education and Social Science Library have compiled extensive sets of resources on rankings, which are reproduced in the appendices with permission. There have also been numerous conferences addressing this topic [5, 6]. Notably, many of the most significant players in the ranking/comparison field have agreed upon a formal set of principles that define quality and good practice for rankings and comparisons. These are presented in Appendix B.
The authors, in collaboration with their university reference librarians and institutional researchers, conducted an extensive review of the periodical, book, and conference literature. This review surfaced over 20 different ranking/rating/comparison schemes with a significant presence (samples are provided in Appendices C and D), and undoubtedly a multitude of additional ones exist. But the authors are compelled to ask: What purposes are served by such comparisons [3, 8, 9, 10], and why are there so many?
Dyrenfurth, M., & Murphy, M., & Bertoline, G. (2010, June), Quality Indicators For Engineering & Technology Education Paper presented at 2010 Annual Conference & Exposition, Louisville, Kentucky. 10.18260/1-2--16889
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2010 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015