
Work in Progress: Enhancing Undergraduate Biomedical Engineering Laboratory Reports through Information and Data Literacy Instruction


Conference

2022 ASEE Annual Conference & Exposition

Location

Minneapolis, MN

Publication Date

August 23, 2022

Start Date

June 26, 2022

End Date

June 29, 2022

Conference Session

Biomedical Engineering Division: Best of Works in Progress

Page Count

6

DOI

10.18260/1-2--41474

Permanent URL

https://peer.asee.org/41474


Paper Authors


Alexander Carroll Vanderbilt University


Alex Carroll, MSLS, AHIP, is the Librarian for STEM Research at the Vanderbilt University Libraries. Alex serves as a liaison librarian for the School of Engineering and STEM academic units within the College of Arts and Science, supporting the research of faculty and developing curriculum-integrated information literacy instruction programs for students in the sciences.

Alex is the Interim Editor-in-Chief of the Journal of the Medical Library Association (JMLA) and is a Senior member of MLA's Academy of Health Information Professionals (AHIP). His research interests include studying the information seeking behaviors and data practices of STEM researchers, mentoring emerging LIS professionals, and improving information literacy instruction for students in the sciences. He has published on these topics in journals such as College & Research Libraries, portal: Libraries and the Academy, The Journal of Academic Librarianship, and the Journal of the Medical Library Association. His work in these areas has been recognized by the ALA Library Instruction Round Table with "Top Twenty" awards in 2018 and 2019.

Previously, Alex was the Lead Librarian for Research Engagement and the Research Librarian for Engineering and Biotechnology at the NC State University Libraries. Prior to joining NC State, Alex was the Agriculture and Natural Resources Librarian at the University of Maryland. He received his MSLS degree from the University of North Carolina at Chapel Hill’s School of Information and Library Science, and his BA from James Madison University.



Joshua Borycz Vanderbilt University


At Vanderbilt University, I help graduate and undergraduate students learn how to do research and succeed academically by introducing them to a range of tools, developing new tools, creating educational programs, and advocating for the use of library services. My goal is to connect researchers with the tools and insights that can help them integrate good data management practices and data-sharing tools to improve scientific collaboration. I became interested in library and information science after completing my PhD in chemistry, and pursued a master's degree at the University of Tennessee, Knoxville. This degree connected me with many opportunities to act as an advocate for integrating library services into modern scientific research.

Previously, I was a computational chemist at the University of Minnesota, Twin Cities, where my research focused on performing quantum mechanical calculations on the utility of metal-organic frameworks for applications involving magnetism, carbon dioxide capture, and catalysis. My interest in fundamental research stemmed from my desire to gain a deeper understanding of processes used in industrial and energy-generation applications. The computational nature of my research gives me a strong understanding of the theory behind these processes and has allowed me to provide insight to, and learn from, experimental chemists and chemical engineers.



Francisco Juarez Vanderbilt University Library


Amanda Lowery Vanderbilt University


Abstract

Motivation

Undergraduate engineering programs seek to train students in the process skills of engineering, which include designing hypotheses, identifying and synthesizing relevant literature, interpreting and analyzing data, and presenting findings [1]. While engineering educators routinely report that engineering process skills are critical, many report difficulty teaching these skills due to time constraints [2]. Librarians, who specialize in the organization of information and data, are well equipped to help biomedical engineering (BME) educators address some of these gaps in their students’ learning [3]. This project sought to determine whether integrating a specialized information literacy curriculum into a BME laboratory course sequence could improve students’ ability to find and use technical information within their laboratory reports.

Background on Problem Being Addressed

Instructional partnerships between academic libraries and biomedical engineering educators often unfold within upper-level design sequences, as the inquiry-based design projects within these courses offer opportunities for meaningful information literacy instruction [4]. However, many BME students encounter assignments earlier in their coursework that require them to develop hypotheses, interpret their own data, and synthesize scientific literature to advance an argument. When faced with these assignments, many students benefit from meeting with a librarian, who can demonstrate how to access a variety of specialized technical information and data sources (e.g., research articles, experimental protocols, and handbooks) that are directly relevant to the students’ specific assignments [5]. Moreover, librarians trained in research data management can also provide insights into best practices for organizing data collected in a laboratory [6].
Instructional Methods

The BME laboratory course at Vanderbilt University was recently transformed from a single three-credit lab course offered in the senior year into three separate one-credit courses offered in the sophomore, junior, and senior years. Across these three one-credit courses, students are expected to grow their ability to conduct laboratory experiments, collect data, interpret their findings, and write effective laboratory reports by placing their findings in context with previously published research on the topic. To help students write their laboratory reports more effectively, a librarian provides a guest lecture in each of the sections, introducing different types of information sources and demonstrating how to access relevant library-licensed materials. In the sophomore course (BME 2900W), a librarian demonstrates how to find published experimental protocols to help students write methods sections. In the junior course (BME 3900W), a librarian demonstrates how to find engineering handbooks and review articles to write more effective introduction sections. In the senior course (BME 4901W), a librarian discusses best practices related to data management, including how to organize files and design machine-readable tabular data. This training program is designed to complement the training students receive in their senior design course (BME 4950), which introduces additional information sources focused on commercialization topics, such as intellectual property, total addressable markets, competitive landscapes, regulatory pathways, and device billing and reimbursement strategies.

Assessment

In the spring 2022 semester, students began to participate in a longitudinal pre-test/post-test assessment that aims to measure the efficacy of this information literacy program.
Students complete a pre-test prior to the BME 2900W intervention to establish their baseline knowledge; they will also complete similar pre-tests prior to the BME 3900W and BME 4901W interventions to gauge their retention over time. The pre-tests and post-test, which include a mix of objective and open-response questions, can be viewed online on the Open Science Framework [7]. Following the BME 4901W intervention, students will complete a post-test; pre-test and post-test scores will be compared to assess knowledge gains, using statistical tests to check for significant differences across the groups’ performance. In addition to the pre-test/post-test study, we will use a mixed methods approach to perform an authentic assessment of a random sample of students’ laboratory reports. Using rubrics to study artifacts of student learning, often called authentic assessment, is the gold-standard method established within the literature [8]–[11]. To assess whether these interventions affected the information sources students consulted when writing their laboratory reports, we customized a rubric that had previously been used to evaluate undergraduate life sciences students’ use of information when completing research assignments [12]. These rubrics will be applied to students’ laboratory reports to measure student achievement of three learning outcomes related to information literacy. The complete rubric, along with the three learning outcomes, can be viewed online on the Open Science Framework [13]. We will also perform a citation analysis of students’ laboratory reports to measure the extent of their information use as reflected in their reference lists and use of internal citations. These quantitative citation analyses will report the mean, standard deviation (SD), and range of the number of sources cited within students’ lab reports.
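The descriptive citation statistics named above (mean, SD, and range of sources cited) are straightforward to compute; the sketch below illustrates one way to do so. The citation counts here are hypothetical placeholders for illustration, not data from this study:

```python
import statistics

def citation_summary(counts):
    """Summarize the number of sources cited per lab report:
    mean, sample standard deviation, and range (min, max)."""
    return {
        "mean": statistics.mean(counts),
        "sd": statistics.stdev(counts),  # sample SD (n - 1 denominator)
        "range": (min(counts), max(counts)),
    }

# Hypothetical citation counts from five lab reports (illustrative only).
counts = [4, 6, 5, 8, 3]
summary = citation_summary(counts)
print(summary)  # mean 5.2, SD ~1.92, range (3, 8)
```

The sample (n − 1) standard deviation is used here, as is conventional when the analyzed reports are a random sample of a larger cohort.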
To isolate the effects of this information literacy training program, for both the rubric and citation analysis portions of this study, lab reports generated by students who received this information literacy training will be compared to a sample of laboratory reports created by previous cohorts of Vanderbilt BME students who did not receive any specialized information literacy training within their laboratory course.

Anticipated Results

Data collection began in the spring 2022 semester and will continue for the next several semesters. Table 1 shows benchmarking data taken from the pre-tests completed by the first cohort of BME 2900W students. We plan to share additional preliminary data at the 2022 Annual Meeting, including qualitative analysis of the open-response questions from the pre-tests. We anticipate that after completing BME 4901W, students will demonstrate an increased understanding of the breadth of available technical information sources. We also anticipate that the laboratory reports completed by students who received this intervention will show improvement across all three learning outcomes, as well as increased use of specialized resources such as experimental protocols and engineering handbooks, as reflected in their internal citations and lists of references.

Table 1: Benchmarking student performance on BME 2900W pre-tests

Question         Q3    Q4    Q5    Q6    Q7    Q8    Q9    Q10   Q11   Q12
Percent Correct  84.6  84.6  87.2  66.7  53.8  71.8  71.8  25.6  30.8  20.5
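The Table 1 benchmarks can be summarized programmatically. This sketch, using the percent-correct values reported above, computes the cohort's mean score and flags the questions that fewer than half of students answered correctly (the below-50% threshold is our illustrative choice, not one stated in the paper):

```python
# Percent of students answering each BME 2900W pre-test question
# correctly, taken from Table 1.
percent_correct = {
    "Q3": 84.6, "Q4": 84.6, "Q5": 87.2, "Q6": 66.7, "Q7": 53.8,
    "Q8": 71.8, "Q9": 71.8, "Q10": 25.6, "Q11": 30.8, "Q12": 20.5,
}

# Mean percent correct across the ten scored questions.
mean_correct = sum(percent_correct.values()) / len(percent_correct)

# Questions answered correctly by fewer than half of the cohort.
weak_questions = [q for q, pct in percent_correct.items() if pct < 50.0]

print(f"Mean percent correct: {mean_correct:.2f}")  # 59.74
print(f"Below 50% correct: {weak_questions}")       # ['Q10', 'Q11', 'Q12']
```

The three weakest questions (Q10–Q12) are where the later interventions would be expected to show the largest pre/post gains.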

Limitations

This study presents data from undergraduate students within a single department at a single institution, which may limit the generalizability of any potential findings. While this study uses a mixed methods approach to study students’ information-seeking behaviors and resource evaluation skills, each of these methods has limitations. Students will complete multiple pre-tests to measure their retention of knowledge and skills between training sessions; however, completing multiple pre-tests can create practice effects that affect performance, which may bias our data [14]. Although rubrics are regarded as gold-standard evidence for assessing student research assignments, critics suggest that rubrics can be uneven in terms of reliability, accuracy, and validity [15]. Likewise, while citation analyses of students’ reference lists are a well-utilized method for assessing student information use [16], researchers have suggested that this approach has limits [17]. In particular, others have noted that the number of sources students choose to cite may reflect assignment requirements rather than a careful and critical selection of authoritative sources [18]. Finally, this study cannot account for additional information literacy training students may have received prior to their enrollment in the study (e.g., within a first-year composition course) or training they might receive simultaneously in other courses outside the biomedical engineering curriculum. These students may possess higher baseline skills or show greater improvement than their peers, creating noise within our data.

Conclusions

Despite these limitations, we believe this information literacy training program has the potential to expand our understanding of whether integrating science process skills instruction into laboratory courses over several semesters can improve students’ ability to find, evaluate, select, and synthesize evidence for use in writing.
We look forward to discussing this program’s design and proposed assessment methods with colleagues at the 2022 Annual Meeting, and would welcome feedback from members of the community on possible improvements we can make prior to the fall 2022 semester.

References

[1] J. Handelsman et al., “Scientific Teaching,” Science, vol. 304, no. 5670, pp. 521–522, Apr. 2004, doi: 10.1126/science.1096022.
[2] D. Coil, M. P. Wenderoth, M. Cunningham, and C. Dirks, “Teaching the Process of Science: Faculty Perceptions and an Effective Methodology,” LSE, vol. 9, no. 4, pp. 524–535, Dec. 2010, doi: 10.1187/cbe.10-01-0005.
[3] A. J. Carroll, “Thinking and Reading like a Scientist: Librarians as Facilitators of Primary Literature Literacy,” Medical Reference Services Quarterly, vol. 39, no. 3, pp. 295–307, Jul. 2020, doi: 10.1080/02763869.2020.1778336.
[4] A. J. Carroll, S. J. Hallman, K. A. Umstead, J. McCall, and A. J. DiMeo, “Using information literacy to teach medical entrepreneurship and health care economics,” Journal of the Medical Library Association, vol. 107, no. 2, pp. 163–171, Apr. 2019, doi: 10.5195/jmla.2019.577.
[5] K. M. Klipfel, “Authentic engagement: Assessing the effects of authenticity on student engagement and information literacy in academic library instruction,” Reference Services Review, vol. 42, no. 2, pp. 229–245, Jun. 2014, doi: 10.1108/RSR-08-2013-0043.
[6] J. Borycz, “Implementing Data Management Workflows in Research Groups Through Integrated Library Consultancy,” Data Science Journal, vol. 20, no. 1, Art. no. 1, Feb. 2021, doi: 10.5334/dsj-2021-009.
[7] A. J. Carroll and J. Borycz, “BME 2900 Information Literacy Learning Assessment,” Feb. 2022, doi: 10.17605/OSF.IO/VT7YC.
[8] S. F. Phelps and K. R. Diller, “Learning outcomes, portfolios, and rubrics, oh my! Authentic assessment of an information literacy program,” portal: Libraries and the Academy, vol. 8, no. 1, pp. 75–89, 2008.
[9] M. Oakleaf, “Using rubrics to assess information literacy: an examination of methodology and interrater reliability,” J. Am. Soc. Inf. Sci., vol. 60, no. 5, pp. 969–983, May 2009, doi: 10.1002/asi.21030.
[10] M. Oakleaf, “Are They Learning? Are We? Learning Outcomes and the Academic Library,” The Library Quarterly, vol. 81, no. 1, pp. 61–82, Jan. 2011, doi: 10.1086/657444.
[11] J. Belanger, N. Zou, J. R. Mills, C. Holmes, and M. Oakleaf, “Project RAILS: Lessons Learned about Rubric Assessment of Information Literacy Skills,” portal: Libraries and the Academy, vol. 15, no. 4, pp. 623–644, 2015.
[12] A. J. Carroll, N. Tchangalova, and E. G. Harrington, “Flipping one-shot library instruction: using Canvas and Pecha Kucha for peer teaching,” J Med Libr Assoc, vol. 104, no. 2, pp. 125–130, Apr. 2016, doi: 10.3163/1536-5050.104.2.006.
[13] A. J. Carroll and J. Borycz, “BME Laboratory Report Rubric,” Mar. 2022, Accessed: May 11, 2022. [Online]. Available: https://osf.io/x85ct/
[14] M. Calamia, K. Markon, and D. Tranel, “Scoring Higher the Second Time Around: Meta-Analyses of Practice Effects in Neuropsychological Assessment,” The Clinical Neuropsychologist, vol. 26, no. 4, pp. 543–570, May 2012, doi: 10.1080/13854046.2012.680913.
[15] A. R. Rezaei and M. Lovorn, “Reliability and validity of rubrics for assessment through writing,” Assessing Writing, vol. 15, no. 1, pp. 18–39, Jan. 2010, doi: 10.1016/j.asw.2010.01.003.
[16] C. C. Barratt, K. Nielsen, C. Desmet, and R. Balthazor, “Collaboration is key: Librarians and composition instructors analyze student research and writing,” portal: Libraries and the Academy, vol. 9, no. 1, pp. 37–56, 2009, doi: 10.1353/pla.0.0038.
[17] A. M. Robinson and K. Schlegl, “Student Bibliographies Improve When Professors Provide Enforceable Guidelines for Citations,” portal: Libraries and the Academy, vol. 4, no. 2, pp. 275–290, Apr. 2004, doi: 10.1353/pla.2004.0035.
[18] S. Hurst and J. Leonard, “Garbage in, garbage out: the effect of library instruction on the quality of students’ term papers,” Electronic Journal of Academic & Special Librarianship, vol. 8, no. 1, 2007.

Carroll, A., Borycz, J., Juarez, F., & Lowery, A. (2022, August). Work in Progress: Enhancing Undergraduate Biomedical Engineering Laboratory Reports through Information and Data Literacy Instruction. Paper presented at 2022 ASEE Annual Conference & Exposition, Minneapolis, MN. 10.18260/1-2--41474

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2022 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.