Faculty Publication Checklists: A Quantitative Method to Compare Traditional Databases to Web Search Engines

Conference

2012 ASEE Annual Conference & Exposition

Location

San Antonio, Texas

Publication Date

June 10, 2012

Start Date

June 10, 2012

End Date

June 13, 2012

ISSN

2153-5965

Conference Session

Considerations for the Collection Conscious Librarian

Tagged Division

Engineering Libraries

Page Count

13

Page Numbers

25.634.1 - 25.634.13

DOI

10.18260/1-2--21391

Permanent URL

https://peer.asee.org/21391

Paper Authors

Biography

Patricia E. Kirkwood, University of Arkansas

Patricia Kirkwood is the Engineering and Mathematics Librarian serving more than 3,000 students in these disciplines. She has provided reference and instruction services in every STEM field. Her interest in finding appropriate ways to evaluate resources and services has resulted in studies using citation analysis, use information, interlibrary loan statistics, and publication patterns.

Abstract

As specialized web crawlers like CiteSeerX and Quertle become more sophisticated, they gain recognition in user communities. It is arguable that these tools may do a better job of gathering references to conference and grey literature than traditional databases do. Librarians need to compare these dissimilar products as they make collection development decisions and provide instruction. The traditional method used to compare databases is based on journal publication lists; these checklists are used to determine depth of content and overlap. Journal title checklists are not applicable when evaluating web search engines. First, there is no publication list for web crawlers. Second, for engineering-related search tools, journal title checklists do not include conference publications, government documents, reports, standards, patents, and grey literature. One method used to assess web crawlers is to compare the results of subject-based searches to the same search in a traditional database. Reviewing the number of items retrieved and the appropriateness of a subset of the citations retrieved can provide a qualitative assessment of how well a search engine performs. However, this method does not show the depth and breadth of content found. This paper explores a quantitative method for comparing the content gathered by web crawlers to the content provided by traditional database publishers. A checklist is developed using author publication lists. The checklist is evaluated and validated during a project that compares the content of a computer-science-specific web crawler to traditional computer science databases. The paper also explores the benefits and limitations of using local faculty CVs to develop the checklist.
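
The paper itself does not include an implementation; as a rough illustration only, the Python sketch below shows one way a CV-derived checklist comparison might be tallied: given the checklist and the items each search tool retrieved, it reports the fraction of checklist items covered, overall and by publication type. All titles, tool names, and data are hypothetical and are not drawn from the paper.

```python
# Hypothetical sketch of a checklist coverage comparison (not from the paper).
from collections import defaultdict

# Checklist built from faculty CVs: (title, publication type). Invented examples.
checklist = [
    ("Adaptive mesh refinement for CFD", "journal"),
    ("Sensor networks for bridge monitoring", "conference"),
    ("Grid computing annual report", "report"),
    ("Method for composite layup", "patent"),
]

# Which checklist titles each (hypothetical) tool returned.
results = {
    "TraditionalDB": {"Adaptive mesh refinement for CFD"},
    "WebCrawler": {
        "Adaptive mesh refinement for CFD",
        "Sensor networks for bridge monitoring",
        "Grid computing annual report",
    },
}

def coverage_by_type(checklist, found):
    """Return (overall coverage, per-type coverage) against the checklist."""
    totals, hits = defaultdict(int), defaultdict(int)
    for title, pub_type in checklist:
        totals[pub_type] += 1
        if title in found:
            hits[pub_type] += 1
    overall = sum(hits.values()) / len(checklist)
    per_type = {t: hits[t] / totals[t] for t in totals}
    return overall, per_type

for tool, found in results.items():
    overall, per_type = coverage_by_type(checklist, found)
    print(f"{tool}: {overall:.0%} overall; "
          + ", ".join(f"{t} {v:.0%}" for t, v in sorted(per_type.items())))
```

Broken out by publication type, such a tally would surface exactly the gap the abstract highlights: strong journal coverage can mask weak coverage of conference papers, reports, patents, and other grey literature.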

Kirkwood, P. E. (2012, June), Faculty Publication Checklists: A Quantitative Method to Compare Traditional Databases to Web Search Engines Paper presented at 2012 ASEE Annual Conference & Exposition, San Antonio, Texas. 10.18260/1-2--21391

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2012 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.