
Concept-based Instruction and Personal Response Systems (PRS) as an Assessment Method for Introductory Materials Science and Engineering


Conference: 2005 Annual Conference
Location: Portland, Oregon
Publication Date: June 12, 2005
Start Date: June 12, 2005
End Date: June 15, 2005
ISSN: 2153-5965
Conference Session: Useful Assessment in Materials Education
Page Count: 10
Page Numbers: 10.334.1 - 10.334.10
DOI: 10.18260/1-2--14146
Permanent URL: https://peer.asee.org/14146


Paper Authors: Edward Goo, Maura Borrego


Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Concept-based Instruction and Personal Response Systems (PRS) as an Assessment Method for Introductory Materials Science and Engineering

Maura Jenkins and Edward K. Goo University of Southern California

Abstract

Personal response systems (PRS) are gaining use as a method to engage students in large science and engineering lectures. Faculty pose questions to the class mid-lecture and receive immediate feedback via remote-control “clickers” on whether students understand the underlying concepts needed to solve problems on homework and exams. The pace of the lecture can then be adjusted to focus on the most difficult concepts.

This method has been thoroughly developed for introductory chemistry and physics courses. Pioneers have developed ConcepTests, or multiple-choice questions that focus on conceptual understanding rather than calculation. These questions encourage peer interaction, as instructors allow students to vote a second time after discussing their initial answer with classmates. Introductory Materials Science and Engineering shares many characteristics with the courses in which this method has been successful: lectures are often large, the course is required, and many students are non-majors.

In this paper, we share our experience in applying this method to an introductory materials science course. We present data on student responses, test scores, and demographics, and a comparison to previous semesters without the response systems. Plans to develop a common bank of materials ConcepTests, building on existing concept inventories, will also be discussed. Practical details about the equipment and software will be shared as well.

Introduction

Concept inventories, or multiple-choice exams focusing on 20-30 major concepts of a specific field, have recently experienced a surge in development as assessments independent of high-stakes testing. In recent years, concept inventories have been developed and tested for reliability in such fields as physics (mechanics) [1], statics [2], fluid mechanics [3], materials [4], and chemistry [5]. In developing these inventories, faculty focus on concepts and reasoning over computation, using varying degrees of rigor to distinguish between the two [2]. In many cases, the developers make use of open-ended responses from current students to develop distractors based on common misconceptions [5]. Reliability is tested by analyzing individual test items [3] or by administering the entire assessment to multiple groups of students [4].

Proceedings of the 2005 American Society for Engineering Education Annual Conference & Exposition Copyright © 2005, American Society for Engineering Education

Goo, E., & Borrego, M. (2005, June), Concept Based Instruction And Personal Response Systems (Prs) As An Assessment Method For Introductory Materials Science And Engineering Paper presented at 2005 Annual Conference, Portland, Oregon. 10.18260/1-2--14146

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2005 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015