Building a Concept Inventory for Numerical Methods: A Chronology

Conference

2016 ASEE Annual Conference & Exposition

Location

New Orleans, Louisiana

Publication Date

June 26, 2016

Start Date

June 26, 2016

End Date

June 29, 2016

ISBN

978-0-692-68565-5

ISSN

2153-5965

Conference Session

NSF Grantees Poster Session II

Tagged Topic

NSF Grantees Poster Session

Page Count

15

DOI

10.18260/p.26401

Permanent URL

https://peer.asee.org/26401

Download Count

613

Paper Authors

Autar K. Kaw, University of South Florida

Autar Kaw is a professor of mechanical engineering at the University of South Florida. He is a recipient of the 2012 U.S. Professor of the Year Award from the Council for Advancement and Support of Education and the Carnegie Foundation for the Advancement of Teaching. The award is the only national program to recognize excellence in undergraduate education.

Professor Kaw received his BE Honors degree in Mechanical Engineering from the Birla Institute of Technology and Science (BITS), India, in 1981, and his M.S. (1984) and Ph.D. (1987) degrees in Engineering Mechanics from Clemson University, SC. He joined the University of South Florida in 1987.

Professor Kaw’s main scholarly interests are in engineering education research, open courseware development, bascule bridge design, fracture mechanics, composite materials, and the state and future of higher education.

Under Professor Kaw's leadership and with funding from the National Science Foundation (2002-16), he and his colleagues from around the nation have developed, implemented, refined, and assessed online resources for an open courseware in Numerical Methods (http://nm.MathForCollege.com). This courseware annually receives 1,000,000+ page views, 1,000,000+ views of the YouTube lectures, and 120,000+ visitors to the "numerical methods guy" blog.

Professor Kaw has written more than 85 refereed technical papers, and his opinion editorials have appeared in the Tampa Bay Times, the Tampa Tribune, and Chronicle Vitae. His work has been covered, cited, or quoted in many media outlets, including the Chronicle of Higher Education, Inside Higher Education, the U.S. Congressional Record, a Florida Senate Resolution, ASEE Prism, and Voice of America.

Yingyan Lou, Arizona State University

Dr. Yingyan Lou is an assistant professor in the Civil, Environmental, and Sustainable Engineering program in the School of Sustainable Engineering and the Built Environment at Arizona State University. She holds a B.S. and a B.A.Econ degree from Beijing University and received her M.S. and Ph.D. degrees in Civil and Coastal Engineering from the University of Florida. Before joining ASU, she worked in the Department of Civil, Construction and Environmental Engineering at the University of Alabama.

Dr. Lou is very passionate about teaching and education research. In her teaching, she always emphasizes not just the “how” but also the “why” by providing background information on broader issues of the discipline and insights into theories and procedures. Dr. Lou has introduced active-learning technologies (such as clickers) to engage students more effectively during lectures and in-class examples. She also participated in a dissertation study on active learning in engineering disciplines while teaching at the University of Alabama.

Andrew Scott, Alabama A&M University

Andrew Scott has been a faculty member with the Department of Electrical Engineering and Computer Science at Alabama A&M University, Huntsville, since 2002. He has a strong background in high-performance scientific computing, including algorithms and numerical analyses on parallel and distributed systems. He has expertise in the following areas: Field Programmable Gate Arrays for reconfigurable computing applications, software development for heterogeneous computing environments, domain decomposition, process mapping and data structuring techniques for distributed platforms, and finite element analysis. He holds BS and MS degrees in mechanical/aerospace engineering from the University of Missouri, Columbia, and a PhD in computer science and engineering from the University of Missouri, Kansas City.

Ronald L. Miller, Colorado School of Mines

Ronald L. Miller is professor emeritus of chemical engineering at the Colorado School of Mines, where he taught chemical engineering and interdisciplinary courses and conducted engineering education research for 28 years. He received three university-wide teaching awards and held a Jenni teaching fellowship at CSM. He received grant awards for education research from the National Science Foundation, the U.S. Department of Education FIPSE program, the National Endowment for the Humanities, and the Colorado Commission on Higher Education, and he has published widely in the engineering education literature. His research interests include measuring and repairing engineering student misconceptions using well-constructed concept inventories.

Abstract

A concept inventory (CI) is an instrument that allows instructors to measure student misconceptions and address them in a course. It is typically a multiple-choice test designed to assess a student's understanding of concepts, with questions that focus on reasoning and logic rather than on declarative knowledge of the subject matter.

In this paper, we chronicle the development of a concept inventory in Numerical Methods for Engineers as part of a current NSF TUES grant. We focus on the timeline so that readers can follow the intricate process of developing a concept inventory: identifying concepts through a Delphi process with experts, developing questions, assessing individual questions, and testing for reliability and validity.

February 2014 – March 2014: A workshop was conducted by an expert who has been the chief developer of three concept inventories. It was attended by the three project PIs and two external members of the grant's evaluation team and was administered as two online sessions totaling four hours. The purposes of the workshop were to 1) identify key concepts and important misconceptions in the domain of numerical methods, 2) review the steps required to develop a valid and reliable concept inventory, 3) write reliable and valid items for each concept, and 4) decide how to collect and analyze pilot data to measure the effectiveness of inventory items (questions and distractors).

March 2014 – June 2014: The PI invited numerical methods instructors from different engineering majors and with varied levels of experience to join a team that would use a Delphi process to identify the 5-10 most important concepts in Numerical Methods. Thirteen instructors, including the three project PIs, accepted the invitation. The process was conducted anonymously by the CI expert and took four rounds of ranking and discussion to arrive at the top six concepts. As an example, one concept chosen was “to demonstrate the deep relationship of Taylor series to numerical methods such as derivation of methods, error analysis, and order of accuracy.”
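
As a standard illustration of that concept (this is not an item from the inventory itself), the forward-difference formula and its order of accuracy follow directly from a Taylor expansion with remainder:

\[
f(x+h) = f(x) + h\,f'(x) + \frac{h^2}{2}\,f''(\xi)
\quad\Longrightarrow\quad
f'(x) = \frac{f(x+h) - f(x)}{h} - \frac{h}{2}\,f''(\xi),
\]

so the truncation error of the forward-difference approximation to \(f'(x)\) is of order \(O(h)\); that is, the method is first-order accurate.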

June 2014 – November 2014: The three project PIs developed question stems for each of the six concepts, drafting 32 questions with at least five questions per concept. Most questions were written in fill-in-the-blank form so that student responses could be gathered for use as distractors; the others were written as multiple-choice questions because they would otherwise have been leading questions. The questions were also answered in a talk-aloud format by two teaching assistants and two students who had recently completed a numerical methods course, and the wording of some questions was revised based on their feedback.

November 2014 – December 2014: Two first-draft tests of 16 questions each were developed so that each could be completed within a 50-75 minute class period. Data were gathered at University A (a large urban university in the Southeast) and University B (a large urban university in the Southwest).

December 2014 – March 2015: Student responses were collated, and the point-biserial correlation coefficient (a measure of discrimination) and the difficulty index (the percentage of test takers who answer a question correctly) were calculated for each question. This allowed us to refine the inventory by identifying questions that were acceptable, those that needed revision and re-testing, and those that were outright inadequate. The concept inventory and the accompanying statistical data were reviewed by the CI expert to confirm that we were following the correct process. A few new questions were added to ensure at least four questions in each of the six topics.
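
The following is a minimal sketch, not the project's analysis code, of how these two item statistics can be computed in Python from a 0/1 scored response matrix; the matrix layout, variable names, and demo data are assumptions made for illustration.

import numpy as np

def item_statistics(scores):
    """Difficulty index and point-biserial discrimination for a 0/1 score matrix
    (one row per student, one column per question)."""
    scores = np.asarray(scores, dtype=float)
    difficulty = scores.mean(axis=0)      # fraction of students answering each question correctly
    totals = scores.sum(axis=1)           # each student's total score on the test
    r_pb = np.empty(scores.shape[1])
    for j in range(scores.shape[1]):
        # Point-biserial coefficient = Pearson correlation between the 0/1 item
        # score and the total score (some analysts exclude the item from the total).
        r_pb[j] = np.corrcoef(scores[:, j], totals)[0, 1]
    return difficulty, r_pb

# Hypothetical data: five students, four questions, scored 1 (correct) or 0 (incorrect).
demo = [[1, 0, 1, 1],
        [1, 1, 0, 1],
        [0, 0, 1, 0],
        [1, 1, 1, 1],
        [0, 0, 0, 1]]
difficulty, discrimination = item_statistics(demo)
print(difficulty)      # e.g., 0.6 means 60% of test takers answered that question correctly
print(discrimination)  # higher values indicate an item that better separates strong and weak students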

April 2015 – August 2015: Data were then gathered from University B and University C (a historically black university). Using the data from all the questions and implementations, 24 questions (four in each of the six topics) were chosen for the final draft of the concept inventory.

November 2015 – December 2015: The 24-question concept inventory will be tested at University A and University B, providing a combined sample of about 150 students. We will continue to compute the point-biserial correlation coefficient and difficulty index for each question and will also compute Cronbach's alpha to measure the reliability of the instrument. Validity will be assessed against scores on the common multiple-choice portion of the final examination given at Universities A, B, and C. These results will be part of the first draft of the paper, and the final version of the concept inventory will be available at the time of the conference.
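
A similarly minimal sketch, again not the project's code, of the Cronbach's alpha calculation for the same kind of 0/1 score matrix (one row per student, one column per item); it assumes only the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).

import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha (internal-consistency reliability) for an n_students x n_items 0/1 matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items (24 for the final inventory)
    item_var = scores.var(axis=0, ddof=1).sum()  # sum of the individual item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the students' total scores
    return (k / (k - 1.0)) * (1.0 - item_var / total_var)

Values of roughly 0.7 or higher are commonly taken to indicate acceptable internal consistency for an instrument of this kind.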

Kaw, A. K., & Lou, Y., & Scott, A., & Miller, R. L. (2016, June), Building a Concept Inventory for Numerical Methods: A Chronology Paper presented at 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. 10.18260/p.26401
