
Agile Education: What We Thought We Knew About Our Classes, What We Learned, And What We Did About It


Conference

2008 Annual Conference & Exposition

Location

Pittsburgh, Pennsylvania

Publication Date

June 22, 2008

Start Date

June 22, 2008

End Date

June 25, 2008

ISSN

2153-5965

Conference Session

FPD9 - First Year Learning & Assessment

Tagged Division

First-Year Programs

Page Count

23

Page Numbers

13.164.1 - 13.164.23

DOI

10.18260/1-2--3967

Permanent URL

https://peer.asee.org/3967

Paper Authors

Richard Whalen Northeastern University

Richard Whalen, Susan Freeman and Beverly Jaeger are members of Northeastern University's
Gateway Team, a selected group of faculty expressly devoted to the first-year Engineering
Program. The focus of this team is on providing a consistent, comprehensive, and constructive
educational experience in engineering that endorses the student-centered and
professionally-oriented mission of Northeastern University.

Susan Freeman Northeastern University

Beverly Jaeger Northeastern University

Susan Freeman and Beverly Jaeger are members of Northeastern University's Gateway Team, a selected group of faculty expressly devoted to the first-year Engineering Program. The focus of this team is on providing a consistent, comprehensive, and constructive educational experience in engineering that endorses the student-centered and professionally-oriented mission of Northeastern University.

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Agile Education: What We Thought We Knew About our Classes, What We Learned, and What We Did About It

Abstract

In a continuing effort to improve a first-year design course, a team of faculty has evaluated a variety of learning modes over a two-year period by surveying both the student and faculty populations on the learning potential of each of these modes and on the degree to which each mode is interesting or engaging. Following the first year of the study, efforts were made to address learning modes that were rated low in both learning potential and level of engagement. This paper presents the results of the survey administered in the second year and assesses the effectiveness of changes made to some learning modes. In addition to the student survey results, the instructing faculty's personal opinions of the learning potential and level of engagement of each mode are included, along with faculty predictions of how the students would respond from their learner’s perspective. These data were used to establish how well we as educators know our students. Results were evaluated to determine (a) whether our prediction for an activity makes a difference in how the students rate a learning mode for learning potential and level of engagement and (b) whether any mismatch exists between what we think and what they rate. This work provides examples of the student and faculty surveys, proposes solutions, provides an assessment of components and modes that are presently not hitting the mark, and discusses the results of the faculty opinion survey. The hope is that other educators may identify with these outcomes, use similar instruments to gauge their own student populations, and use the results to make helpful adjustments to course content.

Introduction

As educators, much of what we formulate and choose to apply in the classroom with regard to learning activities is usually measured against the potential of each method to teach a concept. In many instances, whether or not the activity will engage the student is secondary to the primary objective: retention of the lesson. Of course, we would prefer to use activities that have a substantial level of engagement as well as a high learning potential, but this simultaneous effect is not always possible. Even learning modes with high engagement levels offer no guarantee that the experience will educate students in the most effective way. Therefore, for any course to evolve to its fullest potential, we must also assess each of the learning modes, or activities, used for its level of engagement as well as its potential for learning. The natural response to any educational assessment is to consider modifications in accordance with the feedback obtained.

The original research initiative, conducted by a team of faculty at Northeastern University, established that our existing first-year design course format was effective from a learning assessment perspective. The course had passed through multiple iterations over an eight-year period, undergoing incremental changes with positive results5. It was then time to take a new look at the course. Subsequent research, conducted by the same team of faculty and presented at ASEE, investigated whether or not high classroom engagement with a variety of learning activities equated to a significant amount of learning for the student. On the survey, the engagement element was defined for the students as follows: “The Interest portion is not merely about how fun the activity is compared to entertainment, but how engaging or interesting it is compared to other classroom teaching options.” Similarly, the concept of learning value was described as follows: “The learning rating [of a particular mode or activity] is not merely about the percentage or amount you learned, but how well it helped you to learn the concept/topic at hand.” The learning activities in the study represented various modes of learning, primarily active learning, service learning, problem-based learning, and case-based learning. In the previous work, the effectiveness of each learning mode was assessed by surveying each student on the self-reported amount learned and on the degree to which each class experience was interesting or engaging.
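To make this kind of comparison concrete, the following is a minimal, purely illustrative Python sketch (not taken from the paper) of how per-mode mean ratings for learning potential and engagement might be tabulated and compared against faculty predictions to flag mismatches. The mode names, sample ratings, 1-5 scale, and mismatch threshold are all assumptions for illustration only.

# Hypothetical sketch: tabulate mean student ratings per learning mode and
# compare them with faculty predictions. All data below are made up.
from statistics import mean

# Student ratings per mode: (learning potential, engagement) pairs on a 1-5 scale
student_ratings = {
    "active learning":        [(4, 5), (5, 4), (4, 4)],
    "case-based learning":    [(3, 2), (4, 3), (3, 3)],
    "problem-based learning": [(4, 4), (5, 5), (4, 3)],
}

# Faculty predictions of how students would rate each mode (learning, engagement)
faculty_predictions = {
    "active learning":        (4.5, 4.5),
    "case-based learning":    (4.0, 4.0),
    "problem-based learning": (4.0, 4.0),
}

MISMATCH_THRESHOLD = 0.75  # assumed cutoff for flagging a prediction gap

for mode, ratings in student_ratings.items():
    learn_mean = mean(r[0] for r in ratings)
    engage_mean = mean(r[1] for r in ratings)
    pred_learn, pred_engage = faculty_predictions[mode]
    gap = max(abs(pred_learn - learn_mean), abs(pred_engage - engage_mean))
    flag = "MISMATCH" if gap > MISMATCH_THRESHOLD else "ok"
    print(f"{mode:24s} learning {learn_mean:.2f} (pred {pred_learn:.1f}), "
          f"engagement {engage_mean:.2f} (pred {pred_engage:.1f}) -> {flag}")

Running the sketch prints one line per mode, making it easy to see which activities the faculty misjudged on either dimension.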

Whalen, R., & Freeman, S., & Jaeger, B. (2008, June), Agile Education: What We Thought We Knew About Our Classes, What We Learned, And What We Did About It Paper presented at 2008 Annual Conference & Exposition, Pittsburgh, Pennsylvania. 10.18260/1-2--3967

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2008 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.