Conference Location: Louisville, Kentucky
Publication Date: June 20, 2010
Conference Start Date: June 20, 2010
Conference End Date: June 23, 2010
ISSN: 2153-5965
Division: Engineering Libraries
Page Count: 17
Page Numbers: 15.278.1 - 15.278.17
DOI: 10.18260/1-2--16508
Permanent URL: https://peer.asee.org/16508
Dana Denick is a Master's degree candidate in Library and Information Science at Drexel University. She is currently the Assistant Librarian for Science and Engineering at W.W. Hagerty Library. Dana holds a BS in mechanical engineering from Bucknell University and an MA in physics education from the University of Virginia.
Jay Bhatt, MSEE, MLIS is the Information Services Librarian (Engineering) at Drexel University. He received IEEE's mentorship award and a Certificate of Appreciation in recognition of outstanding leadership as the Drexel University IEEE Graduate Students Forum Partnership Coordinator and Student Branch Liaison for 2006-2007. He is the 2003 recipient of Drexel University's Harold Myers Distinguished Service Award. He is actively involved with the Engineering Libraries Division of the ASEE.
Bradley Layton is an Associate Professor in the Mechanical Engineering and Mechanics Department at Drexel University, where he investigates the mechanical properties of cells and proteins while leading several engineering design courses. He received his BS in mechanical engineering from MIT. He holds an MS in mechanical engineering and a PhD in biomedical engineering from the University of Michigan. He is a member of the ASME and the BME, and an Associate Editor for the IEEE-EMBC.
Citation Analysis of Engineering Design Reports for Information Literacy Assessment

Abstract
The application of information literacy standards and assessment in higher education is gaining importance in high-stakes decision making and accreditation. Therefore, those responsible for information literacy instruction must apply ongoing, multiple forms of assessment to effectively evaluate student proficiencies. This study explores the assessment of first-year engineering design students’ information literacy skills in order to refine existing methods and library instructional strategies. A citation analysis is presented, representing references cited in first-year engineering design reports from Drexel University’s Introduction to Engineering Design program during the 2008-2009 academic year. Citation style was evaluated, and the quantity, resource type, and currency of each citation were recorded. From a sample of 234 citations, 38% of references were classified as websites, 28% were journal articles, and 12% were books. As in other studies, students showed a marked preference for obtaining background information through web searching rather than through reference books in either print or electronic format. The results of this study were compared to previous assessment efforts and aligned to the ALA/ACRL/STS Task Force on Information Literacy for Science and Technology’s Information Literacy Standards for Science and Engineering/Technology1. The methods and findings of this study demonstrate an evidence-based, standards-based approach to assessing engineering information literacy, specifically how to best serve students new to engineering research, design, and communication. We conclude that a quantitative approach, enabled by trained engineering librarians working in tandem with engineering design instructors, is critical to enhancing the breadth and rigor with which engineering design students reference their work. We further recommend that the methods described herein be considered as an additional criterion for ABET accreditation.
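The tallying the abstract describes amounts to classifying each reference by resource type and computing proportions and currency across the sample. The sketch below illustrates how such a tally could be scripted; it is not the authors' actual instrument, and the file name, column names, and the 2009 reference year are assumptions made only for illustration.

```python
# Minimal sketch of a citation tally of the kind described above (hypothetical data layout).
# Assumes a CSV file "citations.csv" with one row per citation and columns
# "resource_type" (e.g., website, journal, book) and "year"; these names are illustrative.
import csv
from collections import Counter


def summarize_citations(path: str, current_year: int = 2009) -> None:
    type_counts = Counter()
    citation_ages = []

    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Classify each reference by its recorded resource type.
            type_counts[row["resource_type"].strip().lower()] += 1
            # Track currency as the age of the cited source, when a year is given.
            if row.get("year", "").isdigit():
                citation_ages.append(current_year - int(row["year"]))

    total = sum(type_counts.values())
    if total == 0:
        print("No citations found.")
        return

    print(f"Total citations: {total}")
    for resource_type, count in type_counts.most_common():
        print(f"{resource_type:>10}: {count:4d} ({100 * count / total:.0f}%)")
    if citation_ages:
        print(f"Mean citation age: {sum(citation_ages) / len(citation_ages):.1f} years")


if __name__ == "__main__":
    summarize_citations("citations.csv")
```

Run against a coded spreadsheet of references, a script like this reproduces the resource-type breakdown (websites, journal articles, books, and so on) reported in the abstract and gives a simple currency measure for comparison with later cohorts.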
Literature Review
A variety of information literacy assessment techniques have been developed to meet the growing demands of accountability in library instruction. Indirect assessment strategies such as interviews, focus groups, and surveys have been used by some institutions to gain practical insights into student research behavior. Because library instruction typically occurs in a “one-shot” class session, librarians often employ some form of direct assessment, mainly selected-response (multiple choice, fill-in-the-blank, or true/false) assessments focusing on library skills, the appropriate and ethical use of information, student perceptions regarding library resources, and research self-efficacy. A review of the literature shows a wide range of case studies examining measured results and extrapolating the implications of such assessment. This type of summative, selected-response assessment can provide some indication of whether information literacy standards are being met. However, to fully evaluate student proficiencies, information literacy assessment must also be able to assess a student’s ability to apply information literacy skills.
Denick, D., & Bhatt, J., & Layton, B. (2010, June), Citation Analysis Of Engineering Design Reports For Information Literacy Assessment Paper presented at 2010 Annual Conference & Exposition, Louisville, Kentucky. 10.18260/1-2--16508