June 24, 2007
The Use of Direct and Indirect Evidence to Assess University, Program, and Course Level Objectives and Student Competencies
The Chemical Engineering Department at Brigham Young University (BYU) has partnered with BYU’s Institutional Assessment and Analysis unit to implement a suite of assessment tools. These tools use both direct and indirect evidence to assess university, program, and course level objectives and student competencies. Direct measurement tools include a mandatory-pass senior competency exam, instructor end-of-course proficiency evaluations, composite assessment of communication skills across several courses, and the California Critical Thinking Skills Test. Indirect tools include student end-of-course proficiency surveys, in-course minute paper surveys, the National Survey of Student Engagement, and university-conducted surveys of seniors, alumni, and employers.
This paper discusses a suite of direct and indirect assessment tools and their use to facilitate a comprehensive evaluation of student learning and of the learning environment necessary for a continuously improving educational process.
Assessment has the dual purpose of providing evidence that learning objectives are being met and providing feedback to guide the improvement of educational activities. A good assessment program uses a variety of tools to achieve both breadth and depth of analysis, yet remains efficient, so that the time and effort spent on assessment are well invested.
In 2003 the Middle States Commission on Higher Education published a valuable guide on assessment entitled Student Learning Assessment: Options and Resources.1 This guide discusses a variety of direct and indirect assessment tools and their strengths and limitations, and provides insight for the development of assessment programs.
Direct assessment measures are those which provide direct evidence that a learning objective has been met. Such evidence demonstrates the degree to which a student has mastered a particular subject, acquired a specific skill, or developed a certain characteristic. These measures are most commonly applied at the course or program level, but can also be applied at the institution level. Examinations are by far the most common tools for direct assessment. Also valuable are portfolios of sample work, such as writing samples, and evaluations of oral presentations. Direct assessments are a necessary part of an assessment program, but they do not of themselves give a complete analysis: they can show what was learned, but not how or why the learning took place. Indirect measures are better suited to this task and are indispensable means of providing insight into the learning environment in order to improve the learning process.
Indirect measures typically focus on predictors that are correlated to learning, but do not measure learning itself. The most common indirect assessment tools are surveys which solicit input from
Terry, R., Wilding, W. V., Lewis, R., & Olsen, D. (2007, June). The Use of Direct and Indirect Evidence to Assess University, Program, and Course Level Objectives and Student Competencies in Chemical Engineering. Paper presented at the 2007 ASEE Annual Conference & Exposition, Honolulu, Hawaii. doi:10.18260/1-2--2564
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2007 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015