Conference: 2008 Annual Conference & Exposition, Pittsburgh, Pennsylvania
Conference dates: June 22-25, 2008
ISSN: 2153-5965
Division: Engineering Technology
Pages: 13.11.1 - 13.11.22 (22 pages)
DOI: 10.18260/1-2--4449
Permanent URL: https://peer.asee.org/4449
Nirmal K. Das is an associate professor of Civil Engineering Technology at Georgia Southern University. He received a Bachelor of Civil Engineering degree from Jadavpur University, India, and M.S. and Ph.D. degrees in Civil Engineering (structures) from Texas Tech University. His areas of interest include structural analysis, structural reliability and wind engineering. Dr. Das is a registered professional engineer in Ohio and Georgia, and is a Fellow of the American Society of Civil Engineers.
A Case Study of Student Learning in Civil Engineering Technology
Abstract
The curriculum of the four-year, TAC/ABET-accredited Civil Engineering Technology program at Georgia Southern University covers three traditional areas within the discipline of civil engineering: environmental, structures, and transportation. As part of the program’s continuous improvement plan, assessment and evaluation of the program objectives and outcomes are carried out on an ongoing basis. The term “assessment” refers to one or more processes that identify, collect, and prepare data that can be used to evaluate the achievement of program outcomes and educational objectives. The term “evaluation” refers to one or more processes for interpreting the data and evidence accumulated through assessment practices that (a) determine the extent to which program outcomes or educational objectives are being achieved, or (b) result in decisions and actions taken to improve the program. Multiple assessment tools and measures are essential for evaluating both (a) the program outcomes, i.e., the knowledge and capabilities of students at the time of graduation, and (b) the program objectives, i.e., the expected accomplishments of graduates during the first few years after graduation.
The purpose of this paper is to critically examine the assessment data collected for a specific component of the curriculum (structures) over at least two consecutive offerings (usually a year apart), and to draw inferences as to the extent to which the related program outcomes are met. Three required courses, Structural Analysis, Steel Design, and Reinforced Concrete Design, constitute the coursework in this area. Several assessment tools have been used, most of them direct measures. Various rubrics with benchmarks (set prior to data collection) have been used for meaningful assessment and evaluation. For each of the three courses, the paper discusses the corrective actions taken following the assessment of the first-year data, and also the changes, if any, that occurred in student learning as a result of incorporating those actions in the subsequent offering.
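The benchmark-based rubric evaluation described above can be illustrated in code. The following is a minimal sketch, not the program's actual assessment procedure; all rubric scores, the passing score of 70, and the 70% attainment benchmark are hypothetical values chosen for illustration only.

```python
# Minimal sketch of rubric-with-benchmark evaluation.
# All scores and the 70% benchmark below are hypothetical illustrations.

def attainment(scores, passing=70):
    """Assessment: fraction of students whose rubric score meets the passing level."""
    return sum(s >= passing for s in scores) / len(scores)

def evaluate(scores_by_offering, benchmark=0.70):
    """Evaluation: compare each offering's attainment rate against a preset benchmark."""
    return {offering: (attainment(scores), attainment(scores) >= benchmark)
            for offering, scores in scores_by_offering.items()}

# Hypothetical rubric scores from two consecutive offerings of a structures course:
scores = {
    "first offering": [65, 72, 80, 55, 90, 68],
    "second offering": [75, 82, 71, 64, 88, 79],  # after corrective actions
}

for offering, (rate, met) in evaluate(scores).items():
    print(f"{offering}: {rate:.0%} attainment -> benchmark {'met' if met else 'not met'}")
```

Under these made-up numbers, the first offering falls short of the benchmark and the second offering meets it, mirroring the assess-act-reassess cycle the paper describes.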
I. Introduction
Execution of a viable continuous improvement plan (CIP) is essential for the enhancement of a program. The two key elements of a CIP are assessment and evaluation. The term “assessment” refers to one or more processes that identify, collect, and prepare data that can be used to evaluate the achievement of program outcomes and educational objectives. The term “evaluation” refers to one or more processes for interpreting the data and evidence accumulated through assessment practices that (a) determine the extent to which program outcomes or educational objectives are being achieved, or (b) result in decisions and actions taken to improve the program. Program educational objectives are broad statements that describe the career and professional accomplishments that the program is preparing graduates to achieve during the first few years following graduation. Program outcomes are statements that describe the units of knowledge or skill students are expected to acquire from the program to prepare them to achieve the program educational objectives. These are typically demonstrated
Das, N. (2008, June). A Case Study of Student Learning in Civil Engineering Technology. Paper presented at the 2008 Annual Conference & Exposition, Pittsburgh, Pennsylvania. 10.18260/1-2--4449
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2008 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.