
Best Paper PIC IV: The Use of Inquiry-based Activities to Repair Student Misconceptions Related to Heat, Energy, and Temperature


2012 ASEE Annual Conference & Exposition


San Antonio, Texas

Publication Date

June 10, 2012

Start Date

June 10, 2012

End Date

June 13, 2012



Conference Session

NEW THIS YEAR! - ASEE Main Plenary II: Best Paper Recognition & Industry Day Session: Corporate Member Council Speaker

Tagged Topics

ASEE Board of Directors and Corporate Members Council


Page Numbers

25.256.1 - 25.256.15


Paper Authors


Michael J. Prince, Bucknell University


AC 2011-407: THE USE OF INQUIRY-BASED ACTIVITIES TO REPAIR STUDENT MISCONCEPTIONS RELATED TO HEAT, ENERGY AND TEMPERATURE

Michael J. Prince, Bucknell University
Michael Prince is Professor of Chemical Engineering at Bucknell University. His current research examines the use of inquiry-based activities to repair student misconceptions in thermodynamics and heat transfer. He is co-director of the ASEE National Effective Teaching Institute. Address: Department of Chemical Engineering, Bucknell University, Lewisburg, Pennsylvania 17837.

Margot A. Vigeant, Bucknell University
Margot Vigeant is an Associate Professor of Chemical Engineering, with research interests in engineering education, thermodynamics concepts, and bioprocess engineering. She is currently also an Associate Dean in the College of Engineering.

Katharyn E. K. Nottis, Bucknell University
Katharyn E. K. Nottis is an associate professor in the Education department at Bucknell University. An educational psychologist, her research has focused on meaningful learning in science and engineering education, approached from the perspective of Human Constructivism. She has been involved in collaborative research projects focused on conceptual learning in chemistry, seismology, and chemical engineering.

(c) American Society for Engineering Education, 2011

Introduction

There is broad recognition that meaningful learning requires that students master fundamental concepts. Understanding concepts and the connections among concepts is one of the primary distinctions between experts and novices (Bransford et al., 2000; Chi, 2006). Conceptual understanding is also a prerequisite for students to transfer what they have learned in the classroom to new settings, arguably among the most significant goals of an engineering education.
While there is little disagreement about the importance of conceptual learning, a wealth of evidence drawn from decades of research in the sciences (Lightman et al., 1993; Laws et al., 1999; Chi et al., 2005; Reiner et al., 2008) and a growing literature in engineering (Prince et al., 2010; Prince et al., in review; Krause et al., 2003; Steif et al., 2005; Miller et al., 2006; Streveler et al., 2008) demonstrates that students generally enter our classrooms with misconceptions and that traditional instruction is often ineffective for promoting sizeable conceptual change. Addressing this problem requires a paradigm shift in teaching methods, from "teaching by telling" to approaches that engage students directly at a conceptual level and let them actively construct new meanings. Research, much of it in the sciences, has demonstrated that a range of student-centered instructional techniques can significantly improve students' conceptual learning gains (Hake, 1998; Laws et al., 1999; Redish et al., 1997; Mazur, 1997). There is a small but growing body of literature in engineering that supports similar conclusions (Prince et al., 2006, 2009).

Several factors explain why engineering education has not yet fully capitalized on the research, primarily in physics, for addressing student misconceptions: (1) the relevant education literature is unfamiliar to many engineering educators, (2) few concept inventories with good estimates of internal consistency and validity address core engineering areas, and (3) engineering lacks tested educational materials similar to those that have been developed and tested in physics. However, significant progress is being made on each of these issues.
There is a widespread and rapidly growing awareness of the benefits of active-engagement methods in engineering education (Prince, 2004), and significant progress has been made in developing concept inventories for core engineering topics (Evans, 2003; Reed-Rhoads and Imbrie, 2007; Streveler et al., 2008). The lack of established educational materials specifically designed to repair important misconceptions in the core disciplines of engineering is arguably the predominant missing piece. This work seeks to help close that gap by developing inquiry-based activities that address four targeted student misconceptions in the area of heat transfer.

The paper begins by providing background on conceptual change models and methods, illustrating both that misconceptions can be resistant to change and that certain instructional approaches have demonstrated success in other contexts. This is followed by a discussion of the research methodology, including a description of the sample demographics. Finally, results on the effectiveness of the developed inquiry-based activities for enhancing student learning are presented, along with a brief discussion of future work.

Background: Conceptual Change Models and Methods

It is important to differentiate situations where learning is easily acquired from robust misconceptions that are resistant to change. Chi (2008) distinguishes conceptual change from other learning situations based on students' preexisting knowledge. In cases where students have either no prior knowledge or correct but incomplete prior knowledge, learning involves adding new information. However, where students enter the classroom with significant misconceptions, learning requires change. Bransford et al. (2000) similarly stress the importance of understanding the state of students' preexisting knowledge when designing instruction. Significant research shows that conceptual change is difficult for a number of reasons.
Ozdemir and Clark (2007) provide a good overview of conceptual change theories, and Streveler et al. (2008) provide a targeted overview of conceptual learning in engineering.

While conceptual change is difficult, a number of approaches have shown promise for promoting conceptual learning relative to traditional instruction. Most of those approaches are active-engagement methods, and many are inquiry-based. Bernhard (2000) provides a good overview of the range of inquiry-based approaches developed for physics education, including Physics by Inquiry, Peer Instruction, Real Time Physics, Tools for Scientific Thinking, and Workshop Physics. Prince and Felder (2006, 2007) provide extensive evidence that a variety of inquiry-based instructional methods are effective for promoting conceptual understanding as well as additional educational outcomes. The framework adopted for the activities presented in this study drew heavily on the Workshop Physics model, whose defining elements (Laws et al., 1999) are shown in Table 1.

Table 1: Elements of Inquiry-Based Activity Modules (Laws et al., 1999)
(a) Use peer instruction and collaborative work
(b) Use activity-based guided-inquiry curricular materials
(c) Use a learning cycle beginning with predictions
(d) Emphasize conceptual understanding
(e) Let the physical world be the authority
(f) Evaluate student understanding
(g) Make appropriate use of technology
(h) Begin with the specific and move to the general

Identifying Critical Engineering Concepts and Misconceptions

Misconceptions related to heat, energy and temperature are widely recognized in the literature (Carlton, 2000; Jasien and Oberem, 2002; Thomas et al., 1995; Sozbilir, 2003). This study focuses on four targeted concept areas related to heat transfer that previous research identified as both important and difficult for students to understand (Nottis et al., 2009; Prince et al., 2009; Streveler et al., 2003): (1) temperature vs.
energy, (2) temperature vs. perceptions of hot and cold, (3) factors that affect the rate vs. the amount of heat transferred, and (4) the effect of surface properties on thermal radiation.

Developing Inquiry-Based Activity Modules for Targeted Misconceptions

Inquiry-based activities to address the targeted misconceptions were modeled after those developed by the Activity-Based Physics group (Laws et al., 1999; Activity-Based Physics webpage). This approach is similar to that proposed by others (Hausfather, 1992; Thomas et al., 1995) and has extensive empirical support (Laws et al., 1999; Thacker et al., 1994; Thomas et al., 1995). Letters in parentheses in the following description of the activities refer to the elements of Table 1, to demonstrate the consistency of the approach employed here with those methods. Students were put in teams (a) and asked to predict what would happen in a number of scenarios (c); a sample scenario is shown in Appendix 1. The students were then given physical experiments and/or computer simulations to test their predictions (b, e, g), after which they were asked to discuss how their thinking had changed if their predictions did not match reality. All the questions were conceptual in nature (d, f), using technology where appropriate (g). At the end of the specific activities, students were asked to step back and generalize what they had learned from the specific experiments, and in some cases to extend that knowledge to a novel application to determine whether the learning transferred to a new situation (h).

Methodology

This exploratory study examined the effect of eight inquiry-based activities on students' conceptual understanding in the four targeted concept areas, using the newly developed Heat and Energy Concept Inventory (HECI).
The instrument was designed specifically to assess these concept areas and has demonstrated acceptable levels of internal consistency reliability and content validity (Prince et al., 2011).

A quasi-experimental design with intact groups was used to assess learning gains. The two groups were a test group that was given the activities and a control group that was not. Participants completed a computerized version of the HECI before and after instruction. Detailed instructions were provided as a cover page to both the faculty administering the concept inventory and the students completing it. The instructions specified that the pre-test be conducted within the first two weeks of the semester and that the post-test be completed within the last two weeks of instruction. Test conditions were specified to standardize the students' experience: the concept inventory was to be completed individually, within one hour, and without the assistance of any reference materials. Instructors were encouraged to provide a modest grade incentive, such as awarding bonus points, for students to make a serious effort to answer the questions completely, and were given the flexibility to administer the instrument either in or out of a regularly scheduled class period. Students were told that none of the questions on the instrument were purposely designed to mislead them and were instructed to make their best effort to answer all questions. Measurements for the control group assessed pre/post changes on the HECI under normal conditions, that is, without the use of the activities. Student learning gains for this sample were compared to gains found for a test sample of students who experienced the activities in their heat transfer course.

Descriptive statistics examined changes in knowledge, as measured by the mean scores of participants on the entire concept inventory as well as on each conceptual-area sub-test.
Independent t-tests were used to examine the differences between the two groups (e.g., the difference in pre-test scores of the control and test groups). Dependent t-tests were used to examine pre/post learning differences for both the control group (without activities) and the test group (with activities). Normalized gains were also used to compare the groups. In addition, effect sizes were calculated to show the magnitude of the difference between the means of each group; the appropriate measure of effect size for t-tests is Cohen's d (Cronk, 2010).

Demographics

The HECI was administered as a pre-test of existing knowledge to a control group of 373 undergraduate engineering students at ten different universities or colleges. The selection of schools included geographically diverse private and public institutions from across the United States, ranging in total enrollment from approximately 2,000 to 40,000 students. The concept inventory was used in 11 course offerings, two of which were offered at the same institution in two different semesters. Of the 373 respondents, 344 completed the concept inventory again after instruction in a heat transfer course.

The test group consisted of a sample of 129 students at four undergraduate institutions. The HECI was administered as a pre-test of existing knowledge to this group. Of the 129 respondents, 116 completed the concept inventory again after instruction that included administration of the inquiry-based activities. Demographic information for both student samples is shown in Table 2. An example inquiry-based activity is shown in Appendix 1, and sample instructions to faculty are shown in Appendix 2. As described earlier, each activity was designed to incorporate each of the elements of inquiry-based activities defined in Table 1. Eight activities were tested in this study, two targeting each of the four concept areas of the HECI.
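The statistical comparisons described in the Methodology (independent and dependent t-tests, Cohen's d effect sizes, and normalized gain) can be sketched with SciPy. This is an illustrative sketch, not the authors' analysis code, and the score arrays are synthetic placeholders; only the 49.2%/54.5% control-group means at the end are figures reported in the paper.

```python
# Sketch of the paper's statistical comparisons using SciPy.
# All score arrays are synthetic placeholders, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control_pre = rng.normal(49, 10, 344)   # hypothetical HECI % scores
control_post = rng.normal(55, 10, 344)
test_pre = rng.normal(50, 10, 116)

# Independent t-test: did the control and test groups start at the
# same level on the pre-test?
t_ind, p_ind = stats.ttest_ind(control_pre, test_pre)

# Dependent (paired) t-test: pre/post learning gain within one group.
t_dep, p_dep = stats.ttest_rel(control_pre, control_post)

def cohens_d(a, b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    pooled_sd = np.sqrt((np.var(a, ddof=1) + np.var(b, ddof=1)) / 2)
    return (np.mean(b) - np.mean(a)) / pooled_sd

def normalized_gain(pre_mean, post_mean):
    """Measured gain divided by the maximum possible gain (scores in %)."""
    return (post_mean - pre_mean) / (100.0 - pre_mean)

# Reproduce the control-group figure reported in the paper:
# (54.5 - 49.2) / (100 - 49.2) = 10.4%.
print(round(100 * normalized_gain(49.2, 54.5), 1))  # 10.4
```

Note that Cohen's d pairs naturally with the t-tests here: the t statistic scales with sample size, while d expresses the same mean difference in pooled-standard-deviation units.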
Students at each institution used all of the activities.

Table 2: Demographics of Student Samples for both Control and Test Groups

Control Group (No Activities):
- Totals: N = 373 (pre), 344 (post)
- Gender: 73.4% male, 26.6% female
- Ethnicity: 80.9% White, 9.8% Pacific Islander, 2.9% African American, 2.4% Hispanic
- Academic major: 39.5% chemical engineering, 47.4% mechanical engineering, 3.7% civil engineering, 0.3% environmental engineering, 9.2% "other"
- Class year: 30.2% seniors, 60.5% juniors, 7.9% sophomores, 0.3% graduate students

Test Group (Activities):
- Totals: N = 129 (pre), 116 (post)
- Gender: 76.0% male, 24.0% female
- Ethnicity: 83.7% White, 7.0% Pacific Islander, 0.8% African American, 3.1% Hispanic, 0.8% multiracial, 4.7% other
- Academic major: 51.2% chemical engineering, 36.4% mechanical engineering, 2.3% civil engineering, 10.1% "other"
- Class year: 14.0% seniors, 65.9% juniors, 20.2% sophomores

Results

An independent t-test showed no significant difference between the test group using the inquiry-based activities and the control group on total pre-test scores (t(487) = 1.454, p > 0.05). Paired-samples t-tests showed a statistically significant improvement from pre- to post-test scores for both the test and the control groups. A summary of the pre/post results on the HECI for both groups is shown in Table 3. As can be seen from the table, student learning gains in the control group, while statistically significant, were modest (t(336) = -7.737, p < 0.05).
* Statistically significant at the p < 0.05 level. ** Statistically significant at the p < 0.01 level.

One conventional measurement used in much of the conceptual change research involving physics students is the normalized gain, defined as the improvement in student scores normalized by the possible gain.
For example, the normalized gain for students in the control group is 10.4%, calculated as the measured gain of 5.3% (54.5% - 49.2%) divided by the total possible gain of 50.8% (100% - 49.2%). This can be compared to a normalized gain of 44% with the activities. A chart comparing normalized gains on the instrument as a whole as well as on each of the sub-categories of the HECI is shown in Figure 1. The data show that the activities improved student learning gains in each of the four targeted concept areas as well as overall. These gains are significant, both statistically and in absolute terms.

Finally, an independent t-test was used to examine the differences in post-test scores between the control and the test (inquiry activity) groups, and effect sizes were calculated to characterize the magnitude of the difference. The test group scored significantly higher than the control group on the post-test (t = -8.33).

Appendix 1: Sample Activity

Inquiry-Based Activity 1: Cooling Beverages with Crushed vs. Block Ice

Introduction: In this activity you will add the same mass of ice, either as a solid block or as crushed ice, to a beaker of water and record the rate and amount of cooling provided by each option. You are asked to make predictions and compare those predictions to what happens, along with answering some follow-up questions.

Materials:
- 2 1-liter beakers
- 2 magnetic stirrers with stir bars for mixing the contents of the 1-liter beakers
- Crushed ice (approximately 1 liter)
- Small trays on which to weigh out ice
- Scale to weigh approximately 40 grams of ice
- Food coloring (optional; if used, try to match the colors of the data logging software)
- Computer with data logging software (such as Vernier LabPro) to record temperature vs. time
- 2 temperature sensors specific to the data logging software being used

Note: The experiment can be run with thermometers and watches if more advanced data acquisition equipment is not available.
Students can simply record temperature vs. time manually for each system.

Directions:
1. Predict which system will cool the water to a lower temperature and which will cool the beverage more quickly. Make and record these predictions below, without talking to your lab partners or classmates.
   A. Which option will cool the water to a lower temperature? Why? (Answer in the space below.)
   B. Which option, if either, will cool the water more quickly? Why? (Answer in the space below.)
2. Place one beaker on each of the magnetic stirrers and place a magnetic stir bar in each beaker. Set each stirrer so that each provides the same degree of agitation.
3. From a common container, pour approximately 600 mL of room-temperature water into each of the beakers and turn the stirrers on to the same speed. It is important to use a common container to ensure both beakers have the same initial temperature. If using food coloring, add a few drops to each beaker.
4. Insert the temperature sensors and either start the computer acquisition program to record the initial beaker temperatures or record the initial temperatures manually.
5. Weigh approximately 40 grams of crushed ice into each of two small trays. Make sure both trays contain the same mass of ice. Take one of the ice samples and form it into a tight "snowball". Minimize any water (melted ice) in your ice samples by using fresh ice.
6. Add the loose crushed ice to one of the agitated water beakers and the "snowball" to the other. Immediately begin recording temperature as a function of time with the data acquisition equipment (e.g., Vernier LabPro) until the temperatures of both beakers have stopped changing. Note the initial rate of cooling and the final temperature for each system.

Analysis: Please answer each of the questions below. This analysis may be conducted outside of class or laboratory as homework.
While you should submit individual solutions, you are strongly encouraged to arrive at answers through discussion with laboratory partners or classmates. Remember to put your name or identifying student number on each page of your response.

1. Compare your initial predictions to what actually happened. Were your predictions correct?
2. If the experimental results do not match your initial predictions, come up with a new explanation of the results. In your explanation, pay particular attention to why your original predictions were not correct and how you had to revise your thinking to explain what happened. Discuss your answers with at least two other students and agree on what happened and why.
3. Again working with others, write out the mathematical equations that should govern (1) the rate and (2) the amount of cooling provided by the ice. If necessary, consult your textbook or other sources. Compare your experimental data to your mathematical models and make sure that your model and results agree, or that you can explain any discrepancies.
4. Again working with others, answer the following related questions:
   a. Do factors that increase the rate of heat transfer always increase the amount of heat transfer too?
   b. Can we generalize the answer to that question to other processes, such as mass transfer? For example, do factors that increase the rate at which a sugar cube dissolves in water (such as stirring) also increase the final amount of sugar dissolved in water at equilibrium?
5. What, if anything, did you learn in this activity?

Appendix 2: Laboratory Set-Up for Packet 1

Activity 1: Cooling Beverages with Ice

Laboratory Set-up: This experiment was done at Bucknell using Vernier LabPro software to collect and display temperature vs. time data. The same data can be collected with thermocouples or thermometers and a timepiece, and then plotted.
A photo of the set-up is shown to the right, and various LabPro screen displays are shown below as part of the technical analysis.

Note: In some experiments, ice melt trapped in the snowball caused the packed ice to provide less cooling, since the same mass of ice-cold water does not provide the same cooling as ice due to the heat of fusion. Be sure to use fresh chipped ice, or make efforts to avoid trapping ice melt (liquid water) in your samples.
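The heat-of-fusion effect noted above can be checked with a back-of-the-envelope energy balance. The sketch below uses standard textbook property values and an assumed final beverage temperature of 10 °C (an illustrative value, not one from the activity) to show why 40 g of trapped melt water delivers far less cooling than 40 g of ice:

```python
# Rough energy balance comparing 40 g of ice at 0 degC with 40 g of
# ice-cold liquid water. Property values are standard textbook figures;
# the 10 degC final temperature is an assumption for illustration.
M = 40.0            # g, mass of ice or melt water (matches the activity)
L_FUSION = 334.0    # J/g, latent heat of fusion of water
CP_WATER = 4.18     # J/(g*K), specific heat of liquid water
T_FINAL = 10.0      # degC, assumed final beverage temperature

# Ice: absorbs the heat of fusion, then sensible heat as the melt warms.
q_ice = M * L_FUSION + M * CP_WATER * (T_FINAL - 0.0)

# Ice-cold water: absorbs sensible heat only.
q_water = M * CP_WATER * (T_FINAL - 0.0)

print(q_ice, q_water)             # 15032.0 1672.0 (joules)
print(round(q_ice / q_water, 1))  # 9.0 -> ice gives ~9x the cooling
```

Under these assumptions the latent-heat term dominates by roughly a factor of nine, which is why even a little trapped melt water visibly reduces the cooling the "snowball" provides.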

Prince, M. J. (2012, June), Best Paper PIC IV: The Use of Inquiry-based Activities to Repair Student Misconceptions Related to Heat, Energy, and Temperature Paper presented at 2012 ASEE Annual Conference & Exposition, San Antonio, Texas.

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2012 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015