A citation index is an index of citations between publications, allowing the user to easily discern which later documents cite which earlier documents.
The first citation indices were legal citators such as Shepard's Citations (1873). In 1960, Eugene Garfield's Institute for Scientific Information (ISI) introduced the first citation index for papers published in academic journals, starting with the Science Citation Index (SCI), and later expanding to produce the Social Sciences Citation Index (SSCI) and the Arts and Humanities Citation Index (AHCI). As of 2006, there are other sources of such data, such as Google Scholar.
Citations are used as a measure of the importance or relative value of an information source, such as an individual journal article or book. For example, if an article is frequently cited by other journal articles and books in the discipline, that frequency may indicate the work's relative importance. The analysis of citation impact is called bibliometrics in library and information science and has a wide range of applications.
A citation is the act of acknowledging the author, year, title, and locus of publication (journal, book, or other) of a source used in a published work. Such citations can be counted as measures of the usage and impact of the cited work; this counting is called citation analysis or bibliometrics (see below). Among the measures that have emerged from citation analysis are citation counts for individual articles, authors, and journals.
Citation counts are correlated with other measures of scholarly and scientific performance and impact, and can in some cases be enhanced by making a work open access: self-archiving the complete article on the web, publishing it in an open access journal, or publishing it as an open access article in a hybrid open access journal.
There also exists the h-index, a measure of an individual scientist's impact derived from his or her citation record.
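The h-index can be computed directly from a list of per-paper citation counts. A minimal sketch in Python (the citation counts below are illustrative, not drawn from any real author):

```python
def h_index(citation_counts):
    """Return the largest h such that the author has at least
    h papers with at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank  # still at least `rank` papers with >= rank citations
        else:
            break
    return h

# An author with papers cited 10, 8, 5, 4, and 3 times has h = 4:
# four papers each have at least four citations.
print(h_index([10, 8, 5, 4, 3]))  # 4
```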
There are two publishers of general-purpose academic citation indexes available to libraries by subscription: Thomson Scientific, whose Web of Knowledge includes the Science Citation Index, and Elsevier, which produces Scopus. A number of other indexes, such as Google Scholar and CiteSeer, are more readily available. Each of these products offers an index of citations between publications and a mechanism to establish which documents cite which other documents, although they differ in how the citation list is accessed and displayed. They also differ widely in cost: Web of Knowledge and Scopus are among the highest-cost subscription databases, while the others mentioned are free.
While citation indexes were originally designed for information retrieval, they are increasingly used for bibliometrics and other studies involving research evaluation. Citation data are also the basis of the popular journal impact factor. There is a large body of literature on citation analysis, sometimes called scientometrics (a term invented by Vasily Nalimov) or, more specifically, bibliometrics. The field blossomed with the advent of the Science Citation Index, which now covers source literature from 1900 on. The leading journals of the field are Scientometrics and the Journal of the American Society for Information Science and Technology; ASIST also hosts an electronic mailing list called SIGMETRICS. The method is undergoing a resurgence based on the wide dissemination of the Web of Science and Scopus subscription databases in many universities, and on universally available free citation tools such as CiteBase, CiteSeer, Google Scholar, and Windows Live Academic.
Bibliometrics is a set of methods used to study or measure texts and information. Citation analysis and content analysis are commonly used bibliometric methods. While bibliometric methods are most often used in the field of library and information science, bibliometrics have wide applications in other areas. In fact, many research fields use bibliometric methods to explore the impact of their field, a set of researchers, or a particular paper.
Historically, bibliometric methods have been used to trace relationships among academic journal citations. Citation analysis, which involves examining an item's referring documents, is used in searching for materials and analyzing their merit. Citation indices, such as the Institute for Scientific Information's Web of Science, allow users to search forward in time from a known article to more recent publications which cite the known item.
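The "forward" search a citation index supports can be sketched as an inverted mapping from each cited work to the later works that cite it. The paper identifiers and reference lists below are hypothetical:

```python
from collections import defaultdict

# Hypothetical records: each paper lists the earlier papers it cites.
papers = {
    "garfield1955": [],
    "price1965": ["garfield1955"],
    "small1973": ["garfield1955", "price1965"],
}

# A citation index inverts the reference lists: for each cited work,
# record the later documents that cite it.
cited_by = defaultdict(set)
for citing, references in papers.items():
    for cited in references:
        cited_by[cited].add(citing)

# Searching "forward in time" from a known article:
print(sorted(cited_by["garfield1955"]))  # ['price1965', 'small1973']
```

Ordinary reference lists only point backward in time; precomputing this inverted mapping is what makes the forward lookup cheap.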
Data from citation indexes can be analyzed to determine the popularity and impact of specific articles, authors, and publications. Using citation analysis to gauge the importance of one's work, for example, is a significant part of the tenure review process. Information scientists also use citation analysis to quantitatively assess the core journal titles and watershed publications in particular disciplines; the interrelationships between authors from different institutions and schools of thought; and related data about the sociology of academia.

Some more pragmatic applications of this information include planning retrospective bibliographies, "giving some indication both of the age of material used in a discipline, and of the extent to which more recent publications supersede the older ones"; indicating, through high frequency of citation, which documents should be archived; and comparing the coverage of secondary services, which can help publishers gauge their achievements and competition and can aid librarians in evaluating "the effectiveness of their stock."

There are also limitations to the value of citation data. The data are often incomplete or biased; they have largely been collected by hand (which is expensive), though citation indexes can also be used; and incorrect citing of sources occurs continually. Further investigation is therefore required to truly understand the rationale behind citing before citation data can be confidently applied.
Although citation analysis is nothing new (the Science Citation Index began publication in 1961), early analyses were done manually and thus could not scale. Automated algorithms have made citation analysis much more useful, versatile, and widespread. Google's PageRank is based on the principle of citation analysis, and patent citation maps likewise rest on citation analysis (in this case, the citation of one patent by another).
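The connection between PageRank and citation analysis can be illustrated with a toy power-iteration PageRank over a small invented citation graph. This is a simplified sketch of the idea, not Google's implementation:

```python
def pagerank(links, damping=0.85, iterations=100):
    """Simple power-iteration PageRank over a dict mapping each
    node to the list of nodes it links to (or cites)."""
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new = {node: (1.0 - damping) / n for node in nodes}
        for node, outlinks in links.items():
            if outlinks:
                # A node passes its rank equally to everything it cites.
                share = damping * rank[node] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # Dangling node: spread its rank evenly over all nodes.
                for target in nodes:
                    new[target] += damping * rank[node] / n
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
# "c" accumulates the most rank: it is "cited" by both a and b.
```

As with citation counts, a document is important if important documents point to it; PageRank simply makes that recursion explicit.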
Other bibliometric applications include creating thesauri, measuring term frequencies, exploring grammatical and syntactic structures of texts, and measuring usage by readers.
In 2003, Charles Murray published the results of a vast bibliometric study intended to reveal the "significant figures" in the arts and sciences. Some 4,002 people are ranked in his lists, compiled for 12 domains (eight scientific disciplines, literature, philosophy, and the arts).
In a classic 1965 paper, "Networks of Scientific Papers," Derek J. de Solla Price described the inherent linking characteristic of the SCI. The links between citing and cited papers became dynamic when the SCI began to be published online; the Social Sciences Citation Index became one of the first databases to be mounted on the Dialog system in 1972. With the advent of the CD-ROM edition, linking became even easier and enabled the use of bibliographic coupling (M. M. Kessler) for finding related records. In 1973, Henry Small published his classic work on co-citation analysis, which became a self-organizing classification system that led to document-clustering experiments and eventually to an Atlas of Science, later called Research Reviews.
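The two relatedness measures mentioned here can be sketched over hypothetical reference lists: bibliographic coupling compares the references two papers share, while co-citation counts how often two papers are cited together by later work. The papers and references below are invented:

```python
# Hypothetical reference lists: paper -> set of papers it cites.
refs = {
    "p1": {"a", "b", "c"},
    "p2": {"b", "c", "d"},
    "p3": {"x", "y"},
}

def coupling_strength(p, q):
    """Bibliographic coupling (Kessler): two citing papers are
    related when their reference lists overlap."""
    return len(refs[p] & refs[q])

def cocitation_count(a, b):
    """Co-citation (Small): two cited papers are related when
    later papers cite both of them."""
    return sum(1 for cited in refs.values() if a in cited and b in cited)

print(coupling_strength("p1", "p2"))  # 2 (shared references b and c)
print(cocitation_count("b", "c"))     # 2 (cited together by p1 and p2)
```

Coupling is fixed once a paper is published; co-citation strength can keep growing as new literature appears, which is why Small's measure lends itself to tracking the evolution of research fronts.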
ISI also published Current Contents, a paper publication reproducing journal title pages that was widely used at the time for keeping up with the current literature through selective dissemination of information (SDI): periodic updates of literature searches based on user profiles. The combination with the SCI permitted, in 1965, the first use of earlier cited references as a factor in the selection, in a product called Automatic Subject Citation Alert. This continues in electronic form as the ISI Personal Alert, and the feature is now almost universally available in bibliometric databases and for most electronic journals. In the case of the SCI and SSCI, profiles contained not only traditional natural-language search terms but also terms for cited references and cited authors, though this too is now a part of most such systems. Thus, a user can be alerted to any new works that cite the author, paper, or book in question. Using journal names in a similar way, customized contents pages could also be provided.

The inherent topological and graphical nature of the worldwide citation network, an intrinsic property of the scientific literature, was described by Ralph Garner at Drexel University in 1965. The use of citation counts to rank journals was a technique used in the early part of the twentieth century, but the systematic, ongoing measurement of these counts for scientific journals was initiated by Eugene Garfield at the Institute for Scientific Information, who also pioneered the use of these counts to rank authors and papers. In a landmark paper of 1965, he and Irving Sher showed the correlation between citation frequency and eminence, demonstrating that Nobel Prize winners published five times the average number of papers while their work was cited 30 to 50 times the average. Garfield reported this phenomenon in a long series of essays on the Nobel and other prizes.
The usual summary measure is known as the impact factor: the number of citations to a journal for the previous two years, divided by the number of articles published in those years. It is widely used, for both appropriate and inappropriate purposes; in particular, the use of this measure alone for ranking authors and papers is quite controversial.
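The arithmetic behind the impact factor is straightforward; the figures below are invented for illustration:

```python
def impact_factor(citations_to_prev_two_years, articles_prev_two_years):
    """Journal impact factor for year Y: citations received in Y
    to items published in Y-1 and Y-2, divided by the number of
    articles published in Y-1 and Y-2."""
    return citations_to_prev_two_years / articles_prev_two_years

# A journal that published 120 + 130 articles over the two previous
# years and received 500 citations to them this year:
print(impact_factor(500, 120 + 130))  # 2.0
```

Because the numerator and denominator are journal-wide aggregates, the figure says nothing about the citation count of any individual article in the journal, which is one source of the controversy noted above.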
In an early study in 1964 on the use of citation analysis in writing the history of DNA, Garfield and Sher demonstrated the potential for generating historiographs: topological maps of the most important steps in the history of scientific topics. This work was later automated by E. Garfield, A. I. Pudovkin of the Institute of Marine Biology, Russian Academy of Sciences, and V. S. Istomin of the Center for Teaching, Learning, and Technology, Washington State University, and led to the creation of the HistCite software around 2002.

Autonomous citation indexing was introduced in 1998 by Giles, Lawrence, and Bollacker, and enabled automatic algorithmic extraction and grouping of citations for any digital academic or scientific document. Where previous citation extraction was a manual process, citation measures could now be computed for any scholarly and scientific field and document venue, not just those selected by organizations such as ISI. This led to the creation of new systems for public and automated citation indexing, the first being CiteSeer, soon followed by Cora (recently reborn as Rexa), which focused primarily on the field of computer and information science. These were later followed by large-scale academic citation systems such as Google Scholar and, more recently, Microsoft Academic. Such autonomous citation indexing is not yet perfect in citation extraction or citation clustering, with an error rate estimated by some at 10 percent; this has resulted in such "authors" as Ann Arbor, Milton Keynes, and Walton Hall being credited with extensive academic output. The SCI has also been prepared through purely programmatic methods, and the older records in particular have a similar magnitude of error.
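The extraction-and-grouping step can be sketched very crudely. Real systems such as CiteSeer use far more robust matching; the citation strings and regular expressions below are illustrative assumptions, chosen only to show why variant renderings of the same reference must be collapsed into one group:

```python
import re

# Hypothetical raw citation strings scraped from reference lists.
raw = [
    "Garfield, E. (1955). Citation indexes for science. Science 122.",
    "E. Garfield, 'Citation Indexes for Science', Science, 1955.",
    "Small, H. (1973). Co-citation in the scientific literature. JASIS 24.",
]

def key(citation):
    """Crude grouping key: first capitalized word plus a four-digit
    year. A real matcher must handle initials, venues, typos, etc."""
    surname = re.search(r"[A-Z][a-z]+", citation)
    year = re.search(r"\b(?:18|19|20)\d{2}\b", citation)
    return (surname.group(0) if surname else "",
            year.group(0) if year else "")

groups = {}
for c in raw:
    groups.setdefault(key(c), []).append(c)

# The two differently formatted Garfield references collapse into
# one group, so the cited work is counted once, with two citations.
```

The 10 percent error rate mentioned above arises precisely when such matching fails, for example when an address line like "Milton Keynes" is mistaken for an author name.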
New World Encyclopedia writers and editors rewrote and completed the Wikipedia article in accordance with New World Encyclopedia standards. This article abides by the terms of the Creative Commons CC-by-sa 3.0 License (CC-by-sa), which may be used and disseminated with proper attribution. Credit is due under the terms of this license, which can reference both the New World Encyclopedia contributors and the selfless volunteer contributors of the Wikimedia Foundation.
Note: Some restrictions may apply to use of individual images which are separately licensed.