Impact factor

Impact factor, often abbreviated IF, is a measure of how often the articles in a science or social science journal are cited. It is frequently used as a proxy for the importance of a journal to its field. The impact factor was devised by Eugene Garfield, the founder of the Institute for Scientific Information (ISI), now part of Thomson, a large worldwide U.S.-based publisher. Impact factors are calculated each year by Thomson Scientific for the journals it indexes, and the factors and indices are published in the Journal Citation Reports.

It is, however, possible to measure the impact factor of the journals in which a particular person has published articles. This use is widespread, but controversial. Eugene Garfield warns about the "misuse in evaluating individuals" because there is "a wide variation from article to article within a single journal."[1]

Impact factors have a large but controversial influence on the way published scientific research is perceived and evaluated, including the evaluation of the productivity of research groups and projects by administrators and funding sources, the evaluation of universities and their departments, and the selection of journals for library acquisition. Although the impact factor applies only to journals, it is also used to evaluate individual scientists: a researcher who publishes in journals with high impact factors tends to be judged "important." Some journals, moreover, manipulate the calculation mechanism in order to receive a "high" impact factor.

Measurement and calculation

Measurement

The Institute for Scientific Information calculates the following values for the journals it indexes; both impact factors and immediacy indices are published annually in the Journal Citation Reports:

  • Impact Factor: A general measure of citation impact, calculated over a two-year period.
  • Immediacy Index: The number of citations the articles in a journal receive in a given year, divided by the number of articles published that year. The immediacy index measures how topical and urgent the work published in a scientific journal is.
  • Cited Half-life: The median age of the articles cited in the Journal Citation Reports each year. For example, if a journal's cited half-life in 2005 is 5, articles published from 2001 through 2005 account for half of all the citations to that journal in 2005, and articles published before 2001 account for the other half.[2]
  • Aggregate Impact Factor for a subject category: Calculated from the number of citations to all journals in a subject category and the number of articles published by all journals in that category.

These measures apply only to journals, not to individual articles or individual scientists (unlike, for example, the H-index). The relative number of citations an individual article receives is viewed as its citation impact.
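
To make the immediacy index and cited half-life concrete, here is a minimal Python sketch; the journal and every count in it are invented for illustration:

```python
from statistics import median

# Hypothetical figures for one journal in a single JCR year.
articles_published_2005 = 120
citations_in_2005_to_2005_articles = 54

# Immediacy index: same-year citations divided by articles published that year.
immediacy_index = citations_in_2005_to_2005_articles / articles_published_2005
print(f"Immediacy index: {immediacy_index:.2f}")  # 0.45

# Cited half-life: median age of the journal's articles that were cited in 2005.
# Each entry is the publication year of one cited article (hypothetical data).
cited_pub_years = [2004, 2003, 2005, 2001, 1998, 2002, 2004, 2000, 2003, 1995]
ages = [2005 - year for year in cited_pub_years]
print(f"Cited half-life: {median(ages)} years")  # 2.5
```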

Calculation

The impact factor of a journal is calculated based on a two-year period. It can be viewed as the average number of citations in a year given to those papers in a journal that were published during the two preceding years.[3] For example, the 2003 impact factor of a journal would be calculated as follows:

A = the number of times articles published in 2001-2002 were cited in indexed journals during 2003
B = the number of "citable items" (usually articles, reviews, proceedings, or notes; not editorials or letters to the editor) published in 2001-2002
2003 impact factor = A/B
(Note that the 2003 impact factor was actually published in 2004, because it could not be calculated until all of the 2003 publications had been received.)

A convenient way of thinking about it is that a journal that is cited once, on average, for each article published has an IF of 1 in the expression above.
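
The same arithmetic in a short Python sketch, using hypothetical counts:

```python
# Worked example of the 2003 impact factor; both counts are invented.
A = 420   # citations in 2003 to items the journal published in 2001-2002
B = 250   # "citable items" the journal published in 2001-2002

impact_factor_2003 = A / B
print(f"2003 impact factor: {impact_factor_2003:.2f}")  # 1.68
```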

There are some nuances to this: ISI excludes certain article types (so-called "front-matter" such as news items, correspondence, and errata) from the denominator. Thomson Scientific does not have a fixed rule for which types of articles are considered "citable" and which front-matter.[4]

New journals that are indexed from their first published issue receive an impact factor after two complete years of indexing; in this case, the citations to, and the number of articles published in, the year before Volume 1 are known to be zero. Journals indexed starting with a volume other than the first do not have an impact factor published until three complete data-years are known. Annuals and other irregular publications sometimes publish no items in a particular year, which affects the count. The impact factor applies to a specific time period, and it is possible to calculate it for any desired period; the Journal Citation Reports web site gives instructions. The Journal Citation Reports also includes a table of the relative rank of journals by impact factor within each specific science discipline, such as organic chemistry or psychiatry.

Debate

It is sometimes useful to be able to compare different journals and research groups. For example, a sponsor of scientific research might wish to compare the results to assess the productivity of its projects. An objective measure of the importance of different publications is then required. The number of publications and citation statistics are two obvious candidates for such an objective measure. However, the use of such measures in general and the impact factor in particular is still a matter of debate.

Favorable properties

  • Thomson Scientific's international coverage is wide: the Web of Knowledge indexes 9,000 science and social science journals from 60 countries (though see the coverage criticisms under Objections below).
  • Results are widely (though not freely) available.
  • It is an objective measure (see Objections below).
  • In practice, the main alternative measure of quality is "prestige," a rating by reputation that is very slow to change and cannot be quantified or used objectively; it merely demonstrates popularity.

Thomson Scientific describes the value of the Journal Citation Reports as follows:

  • Librarians can evaluate and document the value of their library's research investments.
  • Publishers and editors can determine journals' influence in the marketplace, review editorial policies and strategic direction, monitor competitors, and identify new opportunities.
  • Authors can identify the most appropriate, influential journals in which to publish.
  • Professors and students can discover where to find the current reading list in their respective fields.
  • Information analysts can track bibliometric and citation trends and patterns.[5]

Objections

In the continuing controversy, numerous criticisms have been made of the use of the impact factor. Besides the more general debate on the usefulness of citation metrics, these criticisms mainly concern the validity of the impact factor, the ease with which it can be manipulated, and its misuse.

Validity

  • The denominator of the impact factor is negotiable and therefore does not reflect actual citation counts.[4]
  • The impact factor could not be reproduced in an independent audit.[6]
  • The impact factor is an average number of citations per paper, but citation counts do not follow a Gaussian distribution; they follow a Bradford distribution, as predicted by theory. The average is therefore not a valid measure for citation evaluation.[7]
  • The temporal window for citation is too short: classic articles are cited frequently even after several decades, but those citations fall outside the two-year window and do not count toward the impact factor.[8]
  • In the short term—especially in the case of low-impact-factor journals—many of the citations to a certain article are made in papers written by the author(s) of the original article.[9] This means that counting citations may be independent of the real “impact” of the work among investigators.
  • Failure to include more international journals and other types of publications. Although the Web of Knowledge indexes journals from 60 countries, the coverage is very uneven: very few publications in languages other than English are included, and very few journals from less-developed countries. Even the journals that are included are undercounted, because most citations to them come from other journals in the same language or country, most of which are not indexed. Many high-quality journals in the applied areas of some subjects, such as marketing communications, public relations, and promotion management, are not included, nor are many important but not peer-reviewed technical magazines. Books, including textbooks, handbooks, and reference books, are not indexed, and neither are the proceedings of conferences, workshops, and symposia.

Manipulation

A journal can adopt editorial policies that increase its impact factor.[10] These editorial policies may not solely involve improving the quality of published scientific work.

  • Journals may publish a larger percentage of review articles. While many research articles remain uncited three years after publication, nearly all review articles receive at least one citation within three years, so review articles can raise a journal's impact factor. The Thomson Scientific website gives directions for removing review journals from the calculation, and for researchers or students with even a slight familiarity with a field, the review journals are obvious.
  • Journals may change the fraction of "citable items" relative to front-matter in the denominator of the IF equation. Which types of articles are considered "citable" is largely a matter of negotiation between journals and Thomson Scientific; as a result of such negotiations, impact factor variations of more than 300 percent have been observed.[4] For instance, editorials in a journal do not count as publications, but when they cite published articles, often articles from the same journal, those citations increase the journal's citation count. The effect is hard to evaluate, because the distinction between editorial comment and short original articles is not obvious; "letters to the editor" might fall into either class.
  • Several methods, not necessarily with nefarious intent, exist for a journal to cite articles in the same journal which will increase the journal's impact factor.[11]
  • An editor of a journal may encourage authors to cite articles from that journal in the papers they submit. The degree to which this practice affects the citation counts and impact factors in the Journal Citation Reports must therefore be examined; most of these effects are discussed on the site's help pages, along with ways of correcting the figures for them if desired. It is, however, nearly universal for articles in a journal to cite primarily articles from the same journal, since those are the articles of comparable merit in the same specialized field. Artificial self-citation becomes especially visible when (i) the journal has a low impact factor in absolute terms and (ii) it publishes only a few papers per year.

In 2007, a specialist journal with an impact factor of 0.66 published an editorial that cited all of its articles from 2005 and 2006, in protest against what its editors considered the absurd use of the impact factor.[12] The large number of self-citations raised the journal's impact factor to 1.44.
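
The arithmetic behind such a protest is simple. The sketch below uses hypothetical counts (the source does not report the journal's totals), chosen so that the numbers reproduce the reported jump:

```python
# Sketch of the self-citation mechanism; all counts are invented.
citable_items = 200          # B: citable items published in the two-year window
external_citations = 132     # A: citations from other sources

print(external_citations / citable_items)  # baseline IF: 0.66

# An editorial counts as front matter, so it adds nothing to B, but the
# citations it makes to the journal's own recent articles are added to A.
editorial_self_citations = 156
print((external_citations + editorial_self_citations) / citable_items)  # 1.44
```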

Misuse

  • The impact factor is often misused to predict the importance of an individual publication from the journal in which it appeared.[13] This works poorly, because a small number of publications are cited far more than the majority: for example, about 90 percent of Nature's 2004 impact factor was based on only a quarter of its publications. The impact factor averages over all articles, and therefore underestimates the citations of the most-cited articles while exaggerating those of the typical publication (see the sketch after this list).
  • Academic reviewers involved in programmatic evaluations, particularly those for doctoral degree granting institutions, often turn to ISI's proprietary IF listing of journals in determining scholarly output. This builds in a bias which automatically undervalues some types of research and distorts the total contribution each faculty member makes.
  • The absolute value of an impact factor is meaningless by itself. A journal with an IF of 2 would not be very impressive in Microbiology, while it would in Oceanography. Such values are nonetheless sometimes advertised by scientific publishers.
  • The comparison of impact factors between different fields is invalid. Yet such comparisons have been widely used for the evaluation of not merely journals, but of scientists and of university departments. It is not possible to say, for example, that a department whose publications have an average IF below 2 is low-level. This would not make sense for Mechanical Engineering, where only two review journals attain such a value.
  • Outside the sciences, impact factors are relevant for fields with a publication pattern similar to the sciences (such as economics), where research publications are almost always journal articles that cite other journal articles. They are not relevant for literature, where the most important publications are books citing other books. Therefore, Thomson Scientific does not publish a JCR for the humanities.
  • Even though they are often applied this way in practice, impact factors cannot correctly be the only consideration in a library's selection of journals. The local usefulness of the journal is at least equally important, as is whether a member of the institution's faculty edits the journal or serves on its editorial review board.
  • Though the impact factor was originally intended as an objective measure of the reputability of a journal (Garfield), it is now being increasingly applied to measure the productivity of scientists. The way it is customarily used is to examine the impact factors of the journals in which the scientist's articles have been published. This has obvious appeal for an academic administrator who knows neither the subject nor the journals.
  • The absolute number of researchers, the average number of authors on each paper, the nature of results in different research areas, and variations in citation habits between disciplines, particularly the number of citations in each paper, all combine to make impact factors between different groups of scientists incommensurable.[11] Medical journals, for example, generally have higher impact factors than mathematical or engineering journals. The publishers accept this limitation; it has never been claimed that impact factors are useful for comparisons between fields, and such a use is an indication of misunderstanding.
  • HEFCE (Higher Education Funding Council for England) was urged by the Parliament of the United Kingdom Committee on Science and Technology to remind Research Assessment Exercise (RAE) panels that they are obliged to assess the quality of the content of individual articles, not the reputation of the journal in which they are published.[14]
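
The skew described in the first point above can be illustrated with a small simulation; every number in it is invented:

```python
import numpy as np

# Hypothetical, deliberately skewed citation distribution: a quarter of the
# papers collect roughly 90 percent of the citations, so the mean (which is
# what the impact factor reports) describes almost no individual paper.
rng = np.random.default_rng(0)
citations = np.concatenate([
    rng.poisson(60, size=25),   # 25 highly cited papers
    rng.poisson(2, size=75),    # 75 rarely cited papers
])

print("mean (what the IF resembles):", citations.mean())
print("median (a typical paper):", np.median(citations))
top_quarter = np.sort(citations)[-25:]
print("citation share of the top quarter:", top_quarter.sum() / citations.sum())
```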

Summary

  • The number of citations to papers in a particular journal does not directly measure the true quality of the journal, much less the scientific merit of the papers within it. It also reflects, at least in part, the intensity of publication or citation in the area, the current popularity of the topic, and the availability of particular journals. Journals with low circulation will never obtain high impact factors in an absolute sense, regardless of the scientific merit of their contents; but if all the journals in a specific subject have low circulation, as in some areas of botany and zoology, the relative standing is still meaningful. Since defining the quality of an academic publication is problematic, involving non-quantifiable factors such as the influence on the next generation of scientists, no single numeric measure can tell the whole story.
  • By counting the frequency of citations per article and disregarding the prestige of the citing journals, the impact factor becomes a metric of popularity, not of prestige.

Alternative measures

PageRank algorithm

In 1976 Gabriel Pinski and Francis Narin suggested a recursive impact factor, to give citations from journals that have high impact greater weight than citations from low-impact journals.[15] Such a recursive impact factor resembles the PageRank algorithm of the Google search engine, though the original Pinski and Narin paper uses a "trade balance" approach in which journals score highest when they are often cited but rarely cite other journals. A number of subsequent authors have proposed related approaches to ranking scholarly journals.[16][17][18] In 2006, Johan Bollen, Marko A. Rodriguez, and Herbert Van de Sompel also proposed using the PageRank algorithm.[19] From their paper:

Rank | ISI Impact Factor | PageRank | Combined
1 | 52.28 ANNU REV IMMUNOL | 16.78 Nature | 51.97 Nature
2 | 37.65 ANNU REV BIOCHEM | 16.39 Journal of Biological Chemistry | 48.78 Science
3 | 36.83 PHYSIOL REV | 16.38 Science | 19.84 New England Journal of Medicine
4 | 35.04 NAT REV MOL CELL BIO | 14.49 PNAS | 15.34 Cell
5 | 34.83 New England Journal of Medicine | 8.41 PHYS REV LETT | 14.88 PNAS
6 | 30.98 Nature | 5.76 Cell | 10.62 Journal of Biological Chemistry
7 | 30.55 Nature Medicine | 5.70 New England Journal of Medicine | 8.49 JAMA
8 | 29.78 Science | 4.67 Journal of the American Chemical Society | 7.78 The Lancet
9 | 28.18 NAT IMMUNOL | 4.46 J IMMUNOL | 7.56 NAT GENET
10 | 28.17 REV MOD PHYS | 4.28 APPL PHYS LETT | 6.53 Nature Medicine

The table shows the top 10 journals by ISI Impact Factor, PageRank, and a modified system that combines the two (based on 2003 data). Nature and Science are generally regarded as the most prestigious journals, and in the combined system they come out on top. That the New England Journal of Medicine is cited even more than Nature or Science might reflect the mix of review articles and original articles that it publishes. It is necessary to analyze the data for a journal in the light of a detailed knowledge of the journal literature.
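
The recursive weighting idea can be sketched in a few lines of Python. The citation matrix below is hypothetical, and the power-iteration form with a damping constant follows the usual PageRank convention rather than Pinski and Narin's exact "trade balance" method:

```python
import numpy as np

# Hypothetical journal-to-journal citation matrix: C[i, j] is the number of
# citations from journal i to journal j.
C = np.array([[ 0., 30., 10.],
              [20.,  0., 40.],
              [ 5., 25.,  0.]])

# Each journal distributes its outgoing citations proportionally; after the
# transpose, column j collects rank from the journals that cite journal j.
P = (C / C.sum(axis=1, keepdims=True)).T

damping = 0.85
n = C.shape[0]
rank = np.full(n, 1.0 / n)
for _ in range(100):  # power iteration until (effectively) convergence
    rank = (1 - damping) / n + damping * P @ rank

print(rank / rank.sum())  # citations from highly ranked journals count more
```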

The Eigenfactor is another PageRank-type measure of journal influence,[20] with rankings freely available at eigenfactor.org.

Notes

  1. Eugene Garfield, A. I. Pudovkin, V. S. Istomin, Algorithmic Citation-Linked Historiography—Mapping the Literature of Science. Retrieved November 17, 2008.
  2. SLU, Impact Factor, Immediacy Index, Cited Half-life. Retrieved December 17, 2008.
  3. Web of Knowledge, Journal Citation Reports. Retrieved August 12, 2008.
  4. PLoS Medicine Editors, The Impact Factor Game. Retrieved December 17, 2008.
  5. Thomson Reuters, Journal Citation Report. Retrieved November 17, 2008.
  6. JCB, Show me the data, Journal of Cell Biology Retrieved December 17, 2008.
  7. Math Union, Citation Statistics, International Mathematical Union. Retrieved December 17, 2008.
  8. M. Bruynooghe, Theory and practice of logic programming and the ISI Web of Knowledge, Association for Logic Programming Newsletter 18 (4). Retrieved July 20, 2008.
  9. S. A. Marashi, On the identity of “citers”: Are papers promptly recognized by other investigators? Med. Hypotheses 65: 822.
  10. Richard Monastersky, The Number That's Devouring Science, The Chronicle of Higher Education. Retrieved December 17, 2008.
  11. A. Fassoulaki, K. Papilas, A. Paraskeva, and K. Patris, Impact factor bias and proposed adjustments for its determination, Acta Anaesthesiologica Scandinavica 46 (7): 902–905. PMID 12139549.
  12. Harm K. Schutte and Jan G. Svec, Reaction of Folia Phoniatrica et Logopaedica on the Current Trend of Impact Factor Measures, Folia Phoniatrica et Logopaedica 59 (6): 281–285.
  13. BMJ, Why the impact factor of journals should not be used for evaluating research. Retrieved December 17, 2008.
  14. House of Commons, Science and Technology—Tenth Report. Retrieved December 17, 2008.
  15. Gabriel Pinski and Francis Narin, Citation influence for journal aggregates of scientific publications: Theory with application to literature of physics, Information Processing & Management 12 (1976): 297–312. DOI 10.1016/0306-4573(76)90048-0.
  16. S. J. Liebowitz and J. P. Palmer. (1984). Assessing the relative impacts of economics journals. Journal of Economic Literature 22: 77–88.
  17. I. Palacios-Huerta and O. Volij (2004). The measurement of intellectual influence. Econometrica 72: 963–977.
  18. Y. K. Kodrzycki and P. D. Yu (2006). New approaches to ranking economics journals. B. E. Journal of Economics Analysis and Policy 5.
  19. Johan Bollen, Marko A. Rodriguez, and Herbert Van de Sompel. (December 2006). Journal Status. Scientometrics 69 (3).
  20. C. T. Bergstrom. (May 2007). Eigenfactor: Measuring the value and prestige of scholarly journals. C&RL News 68 (5).

References

  • Bergstrom, C. T. May 2007. Eigenfactor: Measuring the value and prestige of scholarly journals. C&RL News 68 (5). Retrieved November 17, 2008.
  • Bollen, Johan, Marko A. Rodriguez, and Herbert Van de Sompel. December 2006. Journal Status. Scientometrics 69 (3). Retrieved November 17, 2008.
  • Bruynooghe M. 2005. Theory and practice of logic programming and the ISI Web of Knowledge. Association for Logic Programming Newsletter 18 (4). Retrieved November 17, 2008.
  • Fassoulaki A, Papilas K, Paraskeva A, Patris K (2002). Impact factor bias and proposed adjustments for its determination. Acta anaesthesiologica Scandinavica 46 (7): 902–5.
  • Garfield, Eugene. June 1998. The Impact Factor and Using It Correctly. Der Unfallchirurg 101 (6): 413–414. Retrieved November 17, 2008.
  • House of Commons. 9 Integrity of the publishing process. Retrieved November 17, 2008.
  • I. Palacios-Huerta and O. Volij. 2004. The measurement of intellectual influence. Econometrica 72: 963–977.
  • ISI Web of Knowledge. Journal Citation Reports. Thomson Reuters. Retrieved November 17, 2008.
  • Joint Committee on Quantitative Assessment of Research. "Citation Statistics." International Mathematical Union, June 12, 2008. Retrieved November 17, 2008.
  • Journal Citation Report, Thomson Reuters. Retrieved November 17, 2008.
  • Kodrzycki, Y. K. and P. D. Yu. 2006. New approaches to ranking economics journals. B. E. Journal of Economics Analysis and Policy 5.
  • Liebowitz, S. J. and J. P. Palmer. 1984. Assessing the relative impacts of economics journals. Journal of Economic Literature 22: 77–88. Retrieved November 17, 2008.
  • Marashi, S. A. 2005. On the identity of “citers”: Are papers promptly recognized by other investigators? Med. Hypotheses 65: 822. PMID 15990244.
  • Monastersky, Richard. "The Number That's Devouring Science", The Chronicle of Higher Education, October 14, 2005. Retrieved November 17, 2008.
  • Pinski, Gabriel, and Francis Narin. 1976. Citation influence for journal aggregates of scientific publications: Theory with application to literature of physics. Information Processing & Management 12: 297–312.
  • PLoS Medicine Editors."The Impact Factor Game", PLoS Medicine, June 6, 2006. Retrieved November 17, 2008.
  • Rossner, Mike, Heather Van Epps, and Emma Hill. "Show me the data", Journal of Cell Biology, December 17, 2007. Retrieved November 17, 2008.
  • Schutte, Harm K., and Jan G. Svec. 2007. Reaction of Folia Phoniatrica et Logopaedica on the Current Trend of Impact Factor Measures. Folia Phoniatrica et Logopaedica 59 (6): 281–285.
  • Seglen PO. 1997. Why the impact factor of journals should not be used for evaluating research. BMJ 314 (7079): 498–502. Retrieved November 17, 2008.
  • Swedish University of Agricultural Sciences Libraries. Impact Factor, Immediacy Index, Cited Half-life. Retrieved November 17, 2008.


Credits

New World Encyclopedia writers and editors rewrote and completed the Wikipedia article in accordance with New World Encyclopedia standards. This article abides by the terms of the Creative Commons CC-by-sa 3.0 License (CC-by-sa), which may be used and disseminated with proper attribution. Credit is due under the terms of this license both to New World Encyclopedia contributors and to the selfless volunteer contributors of the Wikimedia Foundation.

Note: Some restrictions may apply to use of individual images, which are separately licensed.