
Measurement And Evaluation In Engineering Technology



2006 Annual Conference & Exposition


Chicago, Illinois

Publication Date

June 18, 2006

Start Date

June 18, 2006

End Date

June 21, 2006



Conference Session

TC2K Methods and Models

Tagged Division

Engineering Technology

Page Numbers

11.915.1 - 11.915.7




Paper Authors


John Wise Pennsylvania State University


John C. Wise is Director of Engineering Instructional Services at Penn State. In this capacity, he provides assistance to faculty members and teaching assistants in the areas of teaching, learning, instructional technology, and assessment. He received his B.A. in Liberal Arts from The University of the State of New York and his M.S. and Ph.D. in Instructional Systems at Penn State.
Address: 201 Hammond Building, University Park, PA 16802. Telephone: 814-865-4016, FAX: 814-865-4021, email:



Dhaneshwar Lall Pennsylvania State University


Dhaneshwar Lall is a doctoral candidate in Instructional Systems at Penn State University. He is currently the Assessment Coordinator for Engineering Technology programs at the Penn State campuses where he provides assistance to faculty members and administrators with regards to assessment, evaluation, and planning for accreditation of the various programs. He earned his B.S. degree in Chemistry from Hartwick College.
Address: 201 Hammond Building, University Park, PA 16802. Telephone: 814-865-3165, FAX: 814-865-4021, email:



Dhushy Sathianathan Pennsylvania State University


Dhushy Sathianathan is the Head of the School of Engineering Design, Technology, and Professional Programs (SEDTAPP) in the College of Engineering at Penn State University. He received his Ph.D. in Mechanical Engineering from Penn State University. He has led the development of the Engineering Entrepreneurship Minor, and the Center for Engineering Design and Entrepreneurship with external support from Boeing, General Electric (GE), and AT&T Foundation. He is a Boeing Welliver Faculty Fellow and the recipient of the Boeing Outstanding Educator Award, DOW Outstanding Faculty Award, Penn State Engineering Society Outstanding Teaching Award, and several Provost Awards for Curricular Innovation.
Address: 213-D Hammond Building, University Park, PA 16802. Telephone: 814-865-7589, FAX: 814-863-7229, email:



NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Measurement and Evaluation in Engineering Technology: M.E.E.T.


Preparation for compliance with TC2K for ABET accreditation is being carried out at twelve geographically-dispersed campuses which offer one or more of nine different engineering technology programs. An online system has been developed to aid in the collection of assessment data. This system makes use of embedded assessment measures and involves faculty at all levels in the ABET process. The system includes a suite of instruments measuring student performance, faculty perception, and student perception of performance vis-à-vis the program outcomes. This paper will describe the system in its current state and provide examples of data collected to date.

Background

As all engineering technology educators are now aware, the Accreditation Board for Engineering and Technology (ABET) has changed its requirements for program accreditation. Where the criteria once focused on facilities and inputs, they are now learner-centered and performance-based.1 Each engineering and engineering technology program is required to develop learning outcomes and to demonstrate student achievement by assessing student performance on those outcomes. The engineering technology programs at the large land-grant university system that is the focus of this paper are geographically dispersed throughout the state, which makes it difficult for faculty teaching the same courses to work together to develop program and course outcomes and to coordinate the collection of assessment data.

Description of Solution

An assessment team made up of the department head, the director of Engineering Instructional Services, and a graduate research assistant worked with a representative group of faculty to develop an online system that standardizes data collection across the distributed campuses. Each of the nine programs developed specific, measurable outcomes and mapped them to appropriate courses. The assessment team built a database to maintain the outcomes and related data.
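The mapping between program outcomes and the courses that assess them can be pictured as a simple bidirectional lookup. The sketch below is illustrative only; the outcome IDs, course numbers, and structure are assumptions for demonstration, not details of the paper's actual database.

```python
from collections import defaultdict

# Hypothetical program outcomes mapped to the courses that assess them.
# (Outcome IDs and course numbers are illustrative, not from the paper.)
outcome_to_courses = {
    "PO1": ["EET 101", "EET 212"],
    "PO2": ["EET 212", "EET 310"],
}

# Invert the mapping so an instructor can look up which outcomes
# their particular course is responsible for assessing.
course_to_outcomes = defaultdict(list)
for outcome, courses in outcome_to_courses.items():
    for course in courses:
        course_to_outcomes[course].append(outcome)
```

A design like this lets the system answer both questions the paper's instruments need: which courses supply evidence for a given outcome, and which outcomes a given course must report on.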

The final design incorporates six survey and survey-like instruments for data collection. Rather than seek questions and statements that serve as proxies for the program and course outcomes, all of these instruments deal directly with the outcomes themselves.2 In the following section, the three primary instruments that form the M.E.E.T. (“Measurement and Evaluation in Engineering Technology”) system will be described.

1. Student Performance. Faculty are presented with a list of their students, along with a list of the course-level outcomes associated with their course(s). They are asked to rate each student’s ability to perform each outcome using a 3-point scale (“Exceeded”, “Met”, “Not Met”). They are then asked to specify the evidence used to make this judgment, e.g., a specific homework problem or specific lab
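The Student Performance instrument described above amounts to one record per (student, outcome) pair: a rating on the 3-point scale plus the evidence behind it. A minimal data-model sketch follows; the class, field, and example names are hypothetical assumptions, not taken from the M.E.E.T. system itself.

```python
from dataclasses import dataclass
from enum import Enum

class Rating(Enum):
    """The 3-point scale faculty use for each student on each outcome."""
    EXCEEDED = "Exceeded"
    MET = "Met"
    NOT_MET = "Not Met"

@dataclass
class PerformanceRecord:
    """One faculty judgment: a student's performance on one course-level
    outcome, with the evidence (e.g. a homework problem or lab) cited."""
    student_id: str
    course: str
    outcome_id: str
    rating: Rating
    evidence: str

# Hypothetical usage: an instructor rates one student on one outcome.
record = PerformanceRecord(
    student_id="s12345",
    course="EET 101",
    outcome_id="CO1",
    rating=Rating.MET,
    evidence="Homework 3, problem 2",
)
```

Storing the evidence alongside the rating is what makes the measure an embedded assessment: the judgment stays tied to actual coursework rather than to a proxy survey question.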

Wise, J., & Lall, D., & Sathianathan, D. (2006, June), Measurement And Evaluation In Engineering Technology Paper presented at 2006 Annual Conference & Exposition, Chicago, Illinois. 10.18260/1-2--870

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2006 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015