Teaching Students How To Evaluate The Reasonableness Of Structural Analysis Results

Conference

2006 Annual Conference & Exposition

Location

Chicago, Illinois

Publication Date

June 18, 2006

Start Date

June 18, 2006

End Date

June 21, 2006

ISSN

2153-5965

Conference Session

NSF Grantees Poster Session

Tagged Division

Division Experimentation & Lab-Oriented Studies

Page Count

10

Page Numbers

11.1225.1 - 11.1225.10

Permanent URL

https://peer.asee.org/546

Paper Authors

James Hanson Rose-Hulman Institute of Technology

Dr. James Hanson is an Assistant Professor of Civil Engineering at the Rose-Hulman Institute of Technology. He teaches mechanics courses from the freshman through the senior level, including structural analysis and design. He is a strong advocate of hands-on learning and problem-based learning, and he is a licensed professional engineer. He has also taught at Cornell University and Bucknell University.

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Teaching Students How to Evaluate the Reasonableness of Structural Analysis Results

Abstract Structural engineers, and engineers in general, depend heavily on software to assist in complex analyses of large problems. As the size and complexity of a problem increase, however, so do the potential for errors and the devastating impacts of those errors. Unfortunately, few faculty teach undergraduate students how to evaluate the reasonableness of their structural analysis results. Therefore, the National Science Foundation has funded a project to develop a version of the undergraduate structural analysis course that teaches students not only to generate structural analysis results, but also to evaluate those results for reasonableness.

The author has interviewed practicing structural engineers to determine the methods they use to evaluate structural analysis results. The data from the interviews have been blended into a new version of the undergraduate structural analysis course. A comparison of syllabi from the old and new versions of the course shows that teaching evaluation of results can have minimal impact on the time spent on each topic on the syllabus.

The methods being incorporated into the new version of the course focus on simplifying situations into problems that can be easily solved and on anticipating features of complex solutions. This paper summarizes the methods incorporated in this course and provides several examples.
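One such check, sketched here as a hypothetical illustration (the specific beam, numbers, and 25% tolerance are assumptions, not taken from the paper), is to compare a software result against a simplified hand calculation, such as the textbook midspan deflection of a uniformly loaded, simply supported beam, 5wL⁴/(384EI):

```python
def hand_check_deflection(w, L, E, I):
    """Midspan deflection (m) of a simply supported beam under a
    uniform load w, from the standard formula 5*w*L^4 / (384*E*I)."""
    return 5 * w * L**4 / (384 * E * I)

def is_reasonable(software_result, hand_estimate, tolerance=0.25):
    """Flag a result that differs from the simplified hand estimate by
    more than the tolerance (25% here, an illustrative threshold)."""
    return abs(software_result - hand_estimate) <= tolerance * abs(hand_estimate)

# Assumed values: w = 10 kN/m, L = 6 m, E = 200e6 kPa, I = 2e-4 m^4
estimate = hand_check_deflection(10.0, 6.0, 200e6, 2e-4)   # about 4.2 mm
print(is_reasonable(software_result=0.0045, hand_estimate=estimate))
```

The hand estimate deliberately ignores details of the real model; it only needs to be close enough to expose a gross error such as wrong units or a misplaced support.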

In exit interviews, students in both the old and new versions of the course expressed similar attitudes when asked about the existence of reasonable answers and the importance of evaluating results. In other words, students already believed that reasonable answers can be obtained and that it is important to evaluate their answers. However, when students in both versions of the course were tested at the end of the term to measure their ability to evaluate the reasonableness of structural analysis results, the students in the new version showed a measurable increase in their ability to identify the most reasonable answer and to explain why it was the most reasonable.

Introduction

Categories of Errors

To help students learn methods for evaluation, the course begins with a description of four categories of errors in structural analysis and design: idealization of the real structure, assumptions inherent to the analysis method or design equations, roundoff error, and human error.

1. Idealization of the real structure. This category includes all of the assumptions we intentionally make in order to model a structure. Some examples include assuming unrestrained rotation at every joint of a truss, exactly straight members, or perfectly rigid diaphragms. Fortunately, many of the errors induced by the idealization of the structure have a relatively small impact. The load and strength reduction factors used in design standards

Hanson, J. (2006, June), Teaching Students How To Evaluate The Reasonableness Of Structural Analysis Results Paper presented at 2006 Annual Conference & Exposition, Chicago, Illinois. https://peer.asee.org/546

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2006 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015