Assessing the Impact of Innovative ME Courses: Creating and Validating Tools

Conference

2007 Annual Conference & Exposition

Location

Honolulu, Hawaii

Publication Date

June 24, 2007

Start Date

June 24, 2007

End Date

June 27, 2007

ISSN

2153-5965

Conference Session

Emerging Trends in Engineering Education Poster Session

Page Count

23

Page Numbers

12.277.1 - 12.277.23

DOI

10.18260/1-2--1910

Permanent URL

https://sftp.asee.org/1910

Download Count

369

Paper Authors

Elise Amel, University of St. Thomas

Camille George, University of St. Thomas

Dr. George is an Assistant Professor of mechanical engineering at the University of St. Thomas. She teaches the core course in thermodynamics and has received outstanding student evaluations for her engaging teaching style. She maintains a strong interest in technology literacy and in educating the general public. Professor George has developed several innovative courses, including a course specifically about fuel cells that mixed senior engineering students with students from other disciplines and adult learners (non-engineers). She has also spearheaded several international service-learning projects in Haiti and Mali; these projects brought together students from the Department of Modern and Classical Languages, the communication studies department, and the engineering program in an interdisciplinary, year-long effort.

Yvonne Ng, College of St. Catherine

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

The goal of this research was to devise three measurement tools to assess the effectiveness of laboratory innovations in undergraduate engineering courses. The first tool was designed to measure attitudes and impressions about module content and delivery, as well as attitudes toward engineering in general. It included a novel adjective checklist in which the adjectives varied by gender association (masculine or feminine) and by valence (negative, positive, or neutral), intended to gauge whether mechanical engineering, usually perceived as masculine in nature, would gain a more gender-balanced image through innovative laboratory experiences. The second tool used conceptual questions (requiring no formal calculations) in a pretest-posttest format to determine whether students learned the laws of thermodynamics. The third tool took the form of a behavioral rubric designed to assess whether, and how well, students demonstrate knowledge of thermodynamics in subsequent settings such as internships and advanced mechanical engineering courses. Content validation of all three measurement tools was conducted with engineering experts. Methodological strategies and challenges will be discussed.
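To make the checklist idea concrete, here is a minimal scoring sketch. It is not taken from the paper: the adjectives, their codings, and the function name are hypothetical, and the sketch simply assumes each checklist adjective has been pre-coded for gender association and valence so that the items a student checks can be tallied by category before and after the course.

```python
from collections import Counter

# Hypothetical coding of checklist adjectives as (gender association, valence).
# The actual adjectives and codings used in the study are not reproduced here.
ADJECTIVE_CODES = {
    "analytical": ("masculine", "positive"),
    "aggressive": ("masculine", "negative"),
    "rugged": ("masculine", "neutral"),
    "nurturing": ("feminine", "positive"),
    "fussy": ("feminine", "negative"),
    "gentle": ("feminine", "neutral"),
}

def score_checklist(checked_adjectives):
    """Tally the adjectives a student checked by (gender association, valence)."""
    counts = Counter()
    for adjective in checked_adjectives:
        code = ADJECTIVE_CODES.get(adjective.lower())
        if code is not None:  # ignore adjectives outside the coded list
            counts[code] += 1
    return counts

# Comparing pre- and post-course profiles for one (invented) student:
pre = score_checklist(["analytical", "aggressive", "rugged"])
post = score_checklist(["analytical", "nurturing", "gentle"])
print("pre: ", dict(pre))
print("post:", dict(post))
```

A post-course shift toward checking positively valenced adjectives of both gender associations would be consistent with the more gender-balanced image the survey was designed to detect.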

Assessment Needs

Recently, a new course, Engineering in Your World (EYW)1, which fulfills the general education requirement for a science lab course, was developed at the College of St. Catherine, and the course content for Thermodynamics2 at the University of St. Thomas was revised. The revisions were in the spirit of the liberal arts and included hands-on and group activities3, a focus on minimizing negative environmental impact4, consideration of social consequences5, and challenges to student stereotypes of the typical engineer6. To assess these innovations, we decided to measure effectiveness on multiple levels: attitude, learning, and behavior change7. Attitudes toward a course are typically measured by student evaluations at the end of the course; these measures are often standard across disciplines and thus unable to capture information that speaks to specific course goals. Learning is typically measured by quizzes and tests after the information is covered in class. Behavior change as a result of the course is usually assumed rather than measured.

We began the study described in this paper in 2004 by completing literature reviews in three fields of research (psychology, education, and engineering) before concluding that we needed to develop our own tools if we wanted to truly identify change. Essentially, no standard, validated tools met our needs, so we set out to create and validate our own.

Effectiveness measures were developed on three levels (attitude, learning, and behavior):

- Pre- and post-course attitude surveys for students
- Pre- and post-course learning tests (calculation-based and conceptual versions)
- Behavior rubric used by supervisors and faculty
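As a purely illustrative sketch of how the paired learning tests might be summarized (not taken from the paper; the student identifiers and scores below are invented), pre- and post-course conceptual test results could be matched per student and reported as an average raw gain:

```python
from statistics import mean

# Hypothetical paired scores on the conceptual (no-calculation) learning test,
# expressed as the fraction of items answered correctly.
scores = {
    "student_01": {"pre": 0.40, "post": 0.70},
    "student_02": {"pre": 0.55, "post": 0.65},
    "student_03": {"pre": 0.30, "post": 0.60},
}

gains = [s["post"] - s["pre"] for s in scores.values()]

print(f"mean pretest score:  {mean(s['pre'] for s in scores.values()):.2f}")
print(f"mean posttest score: {mean(s['post'] for s in scores.values()):.2f}")
print(f"mean raw gain:       {mean(gains):.2f}")
```

A real analysis would use the actual class data and an appropriate paired statistical test; the snippet only shows the bookkeeping.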

Amel, E., & George, C., & Ng, Y. (2007, June). Assessing the Impact of Innovative ME Courses: Creating and Validating Tools. Paper presented at the 2007 Annual Conference & Exposition, Honolulu, Hawaii. 10.18260/1-2--1910

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2007 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015