
An Assessment Tool to Evaluate Student Learning of Engineering (Fundamental)


Conference

2015 ASEE Annual Conference & Exposition

Location

Seattle, Washington

Publication Date

June 14, 2015

Start Date

June 14, 2015

End Date

June 17, 2015

ISBN

978-0-692-50180-1

ISSN

2153-5965

Conference Session

Fundamental: Metrics & Assessment for K-12 Engineering Education

Tagged Division

K-12 & Pre-College Engineering

Page Count

11

Page Numbers

26.177.1 - 26.177.11

DOI

10.18260/p.23516

Permanent URL

https://peer.asee.org/23516



Paper Authors


Tamara J. Moore, Purdue University, West Lafayette (ORCID: orcid.org/0000-0002-7956-4479)


Tamara J. Moore, Ph.D., is an Associate Professor in the School of Engineering Education and Director of STEM Integration in the INSPIRE Institute at Purdue University. Dr. Moore’s research is centered on the integration of STEM concepts in K-12 and postsecondary classrooms in order to help students make connections among the STEM disciplines and achieve deep understanding. Her work focuses on defining STEM integration and investigating its power for student learning.


Siddika Selcen Guzey, Purdue University, West Lafayette


Dr. Guzey is an assistant professor of biology and biology education at Purdue University. Her research and teaching focus on integrated STEM education.


James Holly Jr., INSPIRE Institute, Purdue University (ORCID: orcid.org/0000-0002-4671-5277)


James Holly Jr. is a Ph.D. student in Engineering Education at Purdue University. He received a B.S. from Tuskegee University and an M.S. from Michigan State University, both in Mechanical Engineering. His research explores formal and informal K-12 engineering education learning contexts. Specifically, he is interested in how the engineering design process can be used to emphasize the humanistic side of engineering and in how engineering habits of mind can enhance pre-college students' learning abilities.



Abstract

While STEM subjects have traditionally been taught separately in K-12 schools, new initiatives share a focus on integrated approaches to teaching STEM. For example, the recently released Next Generation Science Standards (NGSS) address the need for explicit integration of science with engineering. Science teachers are expected to teach crosscutting concepts and core disciplinary science through scientific and engineering practices. One challenge that many science teachers face is the lack of instruments for measuring the student learning required by the NGSS: measuring student understanding of engineering, of the connections between science and engineering, and of engineering practices requires the development of new assessments. The purpose of this study was to develop, scale, and validate an assessment in engineering, with a focus on the use of item response theory (IRT) to assess item functioning. The work is part of a larger project focused on increasing student learning in STEM-related areas in Grades 4–8 through an engineering design-based, integrated approach to STEM instruction and assessment.

The test construction process began with an assessment development team of academic researchers with collective expertise in engineering, science, and mathematics. The researchers wrote 21 multiple-choice items, and the content and face validity of the items were established through expert review. Next, the assessment was piloted in two waves, with the goal of identifying a final version of each assessment via item analyses and scaling students' responses to produce an estimate of their proficiency. In the first wave, the assessment was administered to a small group of about 10–20 middle school students. The purpose was to obtain and analyze preliminary item response data, as well as feedback on characteristics of the assessments and the testing environment that affected performance (e.g., readability, clarity of items). Based on the information provided by wave 1 piloting, the assessment development team modified the assessments. In the second wave of piloting, the revised assessments were administered to approximately 150 students.

Data from wave 2 were used to conduct extensive item analyses of the assessments. We first performed a factor analysis of the data for each assessment for middle school students; the goal of these analyses was to assess the likelihood that the items represented a single underlying construct. The results suggested that a single (major) factor underlies the assessment, a finding that was robust to different methods of factoring (principal axis, maximum likelihood) and factor rotation (varimax, oblique). These results suggest that the goal of constructing items that reflect a single factor was generally met, and they also helped to justify the use of IRT to generate proficiency scores for students. Next, the Rasch IRT model was fitted to the data for each item on each assessment to estimate that item's ability to contribute to estimates of student proficiency. The Rasch model adequately fit the data for all 21 items.

In this presentation, we will describe the development and reliability of the engineering assessment and share the assessment tool with participants.
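The unidimensionality check described in the abstract can be illustrated with a short script. The sketch below is not the authors' analysis: the paper does not name the factoring software, and for simplicity the sketch inspects eigenvalues of the Pearson inter-item correlation matrix (tetrachoric correlations are often preferred for dichotomous items), a common informal stand-in for the principal-axis and maximum-likelihood factoring the abstract reports. The simulated data sizes (150 students, 21 items) mirror the wave-2 pilot; all values are made up.

```python
import numpy as np

def eigenvalue_check(X):
    """X: binary (students x items) response matrix (1 = correct).
    Prints informal evidence for a single dominant factor based on
    eigenvalues of the inter-item correlation matrix."""
    R = np.corrcoef(X, rowvar=False)              # inter-item correlations
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
    # A first eigenvalue several times the second, with the rest trailing
    # off, is common informal evidence for one major underlying factor.
    print(f"top eigenvalues: {np.round(eigvals[:3], 2)}")
    print(f"first/second ratio: {eigvals[0] / eigvals[1]:.2f}")
    return eigvals

# Simulated one-factor data: 150 students, 21 items, as in the pilot.
rng = np.random.default_rng(1)
ability = rng.normal(0.0, 1.0, 150)
difficulty = rng.normal(0.0, 1.0, 21)
P = 1.0 / (1.0 + np.exp(difficulty[None, :] - ability[:, None]))
X = (rng.random(P.shape) < P).astype(float)
eigenvalue_check(X)
```

A first eigenvalue well above the second is the pattern one would expect given the abstract's finding that a single major factor underlies the assessment.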

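The Rasch fitting step can be sketched in the same spirit. The abstract does not report the estimation procedure, so the following uses joint maximum likelihood with alternating Newton-Raphson updates, a standard (if simple) way to fit the model; it illustrates the technique, not the authors' implementation, and the data are again simulated to match the wave-2 pilot size.

```python
import numpy as np

def fit_rasch(X, n_iter=200, tol=1e-6):
    """Joint maximum likelihood for the Rasch model.
    X: binary (students x items) response matrix (1 = correct).
    Returns (theta, b): person proficiencies and item difficulties."""
    # Persons with all-correct or all-incorrect patterns have no finite
    # ML estimate under joint ML, so drop them before fitting.
    scores = X.sum(axis=1)
    X = X[(scores > 0) & (scores < X.shape[1])]
    theta = np.zeros(X.shape[0])   # person proficiency
    b = np.zeros(X.shape[1])       # item difficulty
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(b[None, :] - theta[:, None]))
        info = p * (1.0 - p)       # Fisher information per response
        # Newton-Raphson step for persons (clipped for stability).
        d_theta = np.clip((X - p).sum(axis=1) / info.sum(axis=1), -1, 1)
        theta += d_theta
        p = 1.0 / (1.0 + np.exp(b[None, :] - theta[:, None]))
        info = p * (1.0 - p)
        # Newton-Raphson step for items.
        d_b = np.clip((p - X).sum(axis=0) / info.sum(axis=0), -1, 1)
        b += d_b
        b -= b.mean()              # fix the scale: mean item difficulty 0
        if max(np.abs(d_theta).max(), np.abs(d_b).max()) < tol:
            break
    return theta, b

# Simulated check: 150 students, 21 items, as in the wave-2 pilot.
rng = np.random.default_rng(0)
true_theta = rng.normal(0.0, 1.0, 150)
true_b = rng.normal(0.0, 1.0, 21)
P = 1.0 / (1.0 + np.exp(true_b[None, :] - true_theta[:, None]))
X = (rng.random(P.shape) < P).astype(float)
theta_hat, b_hat = fit_rasch(X)
print(np.corrcoef(true_b, b_hat)[0, 1])  # difficulty recovery
```

In practice, Rasch analyses also report item-fit statistics (e.g., infit and outfit mean squares) to support a claim such as "the Rasch model adequately fit all 21 items"; those diagnostics are omitted here for brevity.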
Moore, T. J., Guzey, S. S., & Holly, J. (2015, June). An Assessment Tool to Evaluate Student Learning of Engineering (Fundamental). Paper presented at 2015 ASEE Annual Conference & Exposition, Seattle, Washington. https://doi.org/10.18260/p.23516

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.