
A Proposed Doctoral Assessment Procedure And Rubric For Science And Engineering


Conference: 2010 Annual Conference & Exposition

Location: Louisville, Kentucky

Publication Date: June 20, 2010

Start Date: June 20, 2010

End Date: June 23, 2010

ISSN: 2153-5965

Conference Session: Graduate Student Experience

Tagged Division: Graduate Studies

Page Count: 12

Page Numbers: 15.78.1 - 15.78.12

DOI: 10.18260/1-2--16106

Permanent URL: https://peer.asee.org/16106

Download Count: 1724

Paper Authors: David Vaccari, Stevens Institute of Technology; Siva Thangam, Stevens Institute of Technology


Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

A PROPOSED DOCTORAL ASSESSMENT PROCEDURE AND RUBRIC FOR SCIENCE AND ENGINEERING David A. Vaccari and Siva Thangam Stevens Institute of Technology

Abstract: Learning outcomes assessment has been ascendant throughout higher education, but little has been developed at the doctoral level. An assessment procedure for doctoral studies is proposed that has two parts: (1) an evaluation of publication rates within two years after completion of the degree, and (2) an assessment of the dissertation and the defense using a number of criteria. The criteria were based on a review of the online literature plus additional criteria developed by the authors. Common criteria include originality, advancement of the state of the art, and demonstration of a high degree of mastery. The additional criteria include: demonstrated mastery of the literature; academic or practical utility of the work; use of advanced or novel techniques; and inclusion of both theoretical and experimental elements. Several other criteria are linked to our institution's mission, including whether the work may lead to marketable technology and whether the candidate demonstrates the ability to communicate orally and in writing at a high level. Note that not all of these criteria are requirements for success; some are intended to evaluate the program rather than the candidate.

A detailed rubric for evaluating the doctoral dissertation and the oral defense was developed. A rubric makes evaluation of the criteria less subjective and can serve as a guide both for the dissertation committee and for the doctoral candidate. The rubric was pilot-tested with several doctoral defenses in engineering programs. The results validated the rubric against the concern that dissertation committees would be reluctant to rate a dissertation they had passed with anything less than top scores. The results also revealed the actual standards that doctoral dissertation committees use in evaluating the dissertation and defense.

Introduction:

Learning outcomes assessment has become a standard part of higher education. Inspired by quality control approaches used in industry, it began to be required in education by specialized accreditation agencies such as ABET, Inc. [1] and the Association to Advance Collegiate Schools of Business (AACSB) [2]. More recently, similar requirements have been adopted by regional accreditation agencies such as the Middle States Commission on Higher Education [3], which has promulgated a requirement that all offerings, including graduate programs, have “program goals that are stated in terms of student learning outcomes.” Middle States does not require graduate programs to be assessed as strictly as undergraduate programs. Nevertheless, they should have “periodic evaluation of the effectiveness (of its educational offerings) … and utilization of evaluation results as a basis for improving its student development program and for enabling students to understand their own educational progress.”

Little information could be found on assessment of doctoral education. Some studies focused on assessing the work of the thesis committees [4]. One researcher identified a major disconnect


Vaccari, D., & Thangam, S. (2010, June), A Proposed Doctoral Assessment Procedure And Rubric For Science And Engineering. Paper presented at 2010 Annual Conference & Exposition, Louisville, Kentucky. 10.18260/1-2--16106

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2010 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015