Honolulu, Hawaii
June 24–27, 2007
ISSN: 2153-5965
Division: Civil Engineering
Pages: 12.1525.1 – 12.1525.10
DOI: 10.18260/1-2--2890
https://peer.asee.org/2890
Use of an Electronic Portfolio for Independent, Robust Direct Measurement of Student Outcomes

Abstract
Assessment of student outcomes continues to evolve in the Department of Civil Engineering at Rose-Hulman Institute of Technology (RHIT). Direct assessment of outcomes is facilitated by the RosE Portfolio, an electronic system to which students submit work sorted by specified outcome, making the submittals available for later assessment by an independent team. The department uses independent faculty and a practitioner to rate engineering submissions.
Use of the electronic portfolio is not without challenges. Student submittals must be made correctly to be rated fairly. The number of submissions must be adequate to assure reliable assessment, and the department must also identify successful and unsuccessful levels of student performance. Used correctly, however, and in conjunction with other, indirect assessments, the electronic portfolio provides a robust and flexible direct assessment of outcomes before graduation.
The paper summarizes:
• the assessment process used for learning outcomes,
• the RosE Portfolio submission process used by the department,
• the process of assuring sufficient submittals for rating,
• assessment of submittals, and
• interpretation of data.
Implementation of the electronic portfolio has not been without some resistance within the department. The paper presents both advantages and disadvantages of this assessment tool along with advice on how similar assessment may be incorporated into other programs. The presentation at the ASEE annual meeting will be in a “point-counterpoint” format by two of the co-authors.
Introduction
A program of learning assessment should be an organized process of (1) identifying objectives consistent with the program mission, (2) developing measurable learning outcomes, (3) setting performance criteria (rubrics) for each outcome, (4) collecting evidence of learning, and (5) evaluating the evidence. This process should be re-evaluated on a regular basis for necessary changes or adjustments. Development of an effective program for assessment of student outcomes can present a challenge to civil engineering programs. Ideally, assessment of learning would be continuous, directly documenting each student’s activities and products throughout their baccalaureate work to assure achievement of all learning outcomes. This is, of course, impractical, so compromises are necessary for programs to assess learning. This paper provides a summary of the assessment program, and the tool that facilitates it, in the Department of Civil Engineering at Rose-Hulman Institute of Technology.
Sutterer, K., Hanson, J., & Houghtalen, R. (2007, June). Use of an Electronic Portfolio for Independent, Robust Direct Measurement of Student Outcomes. Paper presented at the 2007 ASEE Annual Conference & Exposition, Honolulu, Hawaii. DOI: 10.18260/1-2--2890
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2007 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.