Tokyo Keynote Address Abstract

Evaluation of Citation-enhanced Scholarly Databases

Evaluation and comparison of databases by information professionals are critical in selecting the most appropriate database(s) for the task at hand. The inclusion of the references cited by the source documents adds a new dimension not only to searching for closely related papers through bi-directional citation tracing but also to evaluating databases. Appropriately presented cited references can lead users to topically related items without the need for controlled vocabulary or complex search statements. Many of the traditional evaluation criteria, such as breadth of coverage, time span, source coverage, journal base, accuracy, and consistency, also apply to cited references, but there are additional content features specific to cited references, such as their structuring and formatting.
Content criteria should always be evaluated together with the browsing, searching, and output features of the software. It is in the indexing, storing, and displaying of cited references that information storage and retrieval programs differ the most. The smartest programs provide additional citation-specific features, such as intra-database and inter-database linking to the cited and citing sources, calculation and display of the citedness score of cited and citing documents, and ranking of search results by citedness score. These features determine the efficiency of both cited-reference searching and traditional subject searching.
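For illustration only, the sketch below (not part of the address; the record structure and the cited_by_count field are assumed for the example) shows in Python how a result set might be ranked by citedness score:

    # Hypothetical sketch: ordering search results by citedness score,
    # assuming each record carries a citation count ("cited_by_count").
    from dataclasses import dataclass

    @dataclass
    class Record:
        title: str
        cited_by_count: int  # assumed field holding the citedness score

    def rank_by_citedness(results):
        # Most-cited documents first; ties keep their original order.
        return sorted(results, key=lambda r: r.cited_by_count, reverse=True)

    hits = [Record("Paper A", 12), Record("Paper B", 147), Record("Paper C", 3)]
    for r in rank_by_citedness(hits):
        print(r.cited_by_count, r.title)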
The keynote address reviews the major database evaluation criteria, with special emphasis on the citation-related content and software features that can make a significant difference in empowering information professionals.


Tokyo Workshop Abstract

How to compare and evaluate the citation-related content and software features of databases

The workshop illustrates methods for evaluating and comparing the extent, content, format, and structure of cited references in selected databases, publishers’ archives, and repositories. It also demonstrates the differences in a) reporting the citedness score, b) looking up the indexes of, and c) searching for cited authors, institutions, journal names, and document titles on different software platforms (Web of Knowledge, Scopus, Google Scholar, STN, CSA, Dialog, Ovid, OCLC, Ebsco, ScienceDirect, HighWire Press, and CiteSeer).


