
Studying the Reliability and Validity of Test Scores for Mathematical and Spatial Reasoning Tasks for Engineering Students


Conference

2011 ASEE Annual Conference & Exposition

Location

Vancouver, BC

Publication Date

June 26, 2011

Start Date

June 26, 2011

End Date

June 29, 2011

ISSN

2153-5965

Conference Session

Assessment Instruments

Tagged Division

Educational Research and Methods

Page Count

13

Page Numbers

22.1352.1 - 22.1352.13

Permanent URL

https://peer.asee.org/18504

Download Count

21


Paper Authors


Laura L. Pauley Pennsylvania State University, University Park


Laura L. Pauley, Professor of mechanical engineering, joined the Pennsylvania State University faculty in 1988. From 2000 to 2007, she served as the Professor-in-Charge of Undergraduate Programs in Mechanical and Nuclear Engineering. In 2003, Laura received the Penn State Undergraduate Program Leadership Award. Dr. Pauley teaches courses in the thermal sciences and conducts research in computational fluid mechanics and engineering education. She received degrees in mechanical engineering from the University of Illinois (B.S. in 1984) and Stanford University (M.S. in 1985 and Ph.D. in 1988). She can be contacted at LPauley@psu.edu.


Jonna M. Kulikowich Pennsylvania State University


Jonna M. Kulikowich is Professor of Education at Penn State, University Park. Her areas of research and teaching are in test development, applied psychometrics, and statistics. Professor Kulikowich's recent program of research focuses on academic development in mathematics and science. Specifically, she is interested in the measurement of reading comprehension and problem solving in mathematics and science.


Nell Sedransk National Institute of Statistical Sciences


Dr. Nell Sedransk, Co-PI, is the Associate Director of the National Institute of Statistical Sciences (NISS) and Professor of Statistics at North Carolina State University. She is an Elected Fellow of the International Statistical Institute and an Elected Fellow of the American Statistical Association. She has served as Associate Editor for the Journal of the American Statistical Association and the Journal of Statistical Planning and Inference, and has been Vice-Chair of the Publication Board of the American Statistical Association. The areas of her technical expertise and current research include design of complex experiments, Bayesian inference, spatial statistics, and topological foundations for statistical theory. She received her Ph.D. in Statistics in 1969 from Iowa State University. She can be contacted at sedransk@niss.org.


Renata S. Engel Pennsylvania State University, University Park


Renata S. Engel is Associate Dean for Academic Programs and Professor of Engineering Design and Engineering Science & Mechanics. A member of the Penn State faculty since 1990, she served from 2000 to 2006 as the Executive Director of the Schreyer Institute for Teaching Excellence. Through various collaborative efforts, she has effected changes in the engineering curriculum at Penn State, primarily to incorporate elements of design in fundamental engineering courses. Engel earned a B.S. in engineering science at Penn State and a Ph.D. in engineering mechanics at the University of South Florida. She can be contacted at rse1@psu.edu.



Abstract

Background and Motivation

The purpose of this paper is to continue a program of research in assessment and test design for the measurement of three constructs that are key to academic success in engineering. The constructs include the abilities to: a) select mathematical applications relevant to solving varied problems in engineering; b) translate two-dimensional images to three-dimensional images, and vice versa, when solving engineering problems; and c) understand how engineering quantities (e.g., force, work, power, and flow rate) are described by the mathematical representations (e.g., integration, differentiation, or interpolation) presented in statics, dynamics, thermodynamics, and fluid mechanics.

The researchers have designed several problem-solving measures in engineering where scores for the three constructs are hypothesized to predict academic success. This research study demonstrates not only how psychometric models can be applied to study the contribution of the constructs in problem solving, but also how to improve the quality of items, option sets, and scoring keys to increase the reliability and validity of test scores. Several psychometric and statistical methods are employed to determine the value of: a) scoring responses as partially correct to gauge students' misconceptions; b) evaluating the role of time relative to accuracy in the completion of tasks; and c) studying the relations among the three constructs given students' level of success in engineering courses. The item sets include multiple-choice and constructed-response formats. Reliability and validity estimates for these two item formats are compared and contrasted.

Test Design Strategy

The current design for all measures includes computer-based administration, where response times for test completion can be recorded along with interactivity for certain tasks (e.g., manipulation of screen images). Items on the mathematical relevance test present a set of engineering problems where a particular mathematical application is required to solve those problems. The spatial-visualization task enables students to move between two- and three-dimensional displays to select an engineering principle that must be understood (e.g., linear or angular momentum, and mass flow rate). The final test includes items that present a series of mathematical functions in the stem that are required to specify the properties of one or more relations among variables in physics. Students must select the response that establishes the condition for why the mathematical application is needed given the properties of the physics variables.

Data Source and Psychometric Modeling

The data source represents approximately 300 undergraduate students who have declared their major in engineering. Item Response Theory (IRT) models are applied to examine test response characteristics. The researchers will illustrate how psychometric reports can provide evidence that items, options, or constructed-response rating categories are contributing to the reliability of scores. Test scores are then correlated with course grades to establish validity.

Conclusions and Significance

Reliable and valid test scores are needed for domain-specific measures in engineering, not only to profile patterns of strength and weakness for students enrolled in various programs, but also to test instructional interventions that may facilitate academic progress. This presentation introduces test design strategies and psychometric evaluation of scores for three variables considered key to academic success in engineering.
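The abstract refers to score reliability and to Item Response Theory models. As a minimal illustration of these ideas (not taken from the paper itself), the sketch below computes Cronbach's alpha, a common internal-consistency reliability estimate for a scored item set, and evaluates the two-parameter logistic (2PL) IRT item characteristic curve; the function names and the tiny score matrix in the usage example are hypothetical.

```python
from statistics import variance
import math

def cronbach_alpha(scores):
    """Cronbach's alpha for a score matrix given as rows of examinees
    and columns of items: (k / (k - 1)) * (1 - sum of item variances
    / variance of total scores), using sample variances."""
    n_items = len(scores[0])
    item_var_sum = sum(variance(col) for col in zip(*scores))
    total_var = variance([sum(row) for row in scores])
    return (n_items / (n_items - 1)) * (1 - item_var_sum / total_var)

def irt_2pl(theta, a, b):
    """2PL item characteristic curve: probability of a correct response
    for ability theta, discrimination a, and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical dichotomous scores: 4 examinees x 3 items.
scores = [[1, 1, 1],
          [1, 1, 0],
          [0, 0, 1],
          [0, 0, 0]]
print(cronbach_alpha(scores))       # 0.6 for this toy matrix
print(irt_2pl(0.0, a=1.0, b=0.0))   # 0.5: ability equals difficulty
```

In practice, 2PL item parameters are estimated from response data (e.g., by marginal maximum likelihood) rather than fixed by hand; this sketch only shows the quantities the abstract's psychometric reports would be built on.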

Pauley, L. L., Kulikowich, J. M., Sedransk, N., & Engel, R. S. (2011, June). Studying the Reliability and Validity of Test Scores for Mathematical and Spatial Reasoning Tasks for Engineering Students. Paper presented at the 2011 ASEE Annual Conference & Exposition, Vancouver, BC. https://peer.asee.org/18504

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2011 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015