Statement: Publication practices and indices and the role of peer review in research assessment

Progress in science, and confidence in the scientific process, depend on the accuracy and reliability of the scientific literature. This in turn depends on the rigour of the manuscript review process. In addition to ensuring the quality of scientific publications, independent peer review is also a critical part of the evaluation process for both individual scientists and research institutes.

The ICSU Committee on Freedom and Responsibility in the conduct of Science (CFRS) is concerned that some of the policies and practices currently being adopted by scientific institutions and journal publishers may inadvertently be undermining the integrity of the scientific literature. This is compounded by the uncritical use of publication metrics, as a replacement for independent peer review, in evaluating scientific performance. By drawing these concerns to the attention of ICSU Member organizations, the Committee hopes to prompt them to take action to ensure the quality of the scientific record and to promote a cautious and critical approach to the use of publication metrics in research assessment.

Issues of concern:

CFRS is concerned that current policies and practices may be having serious adverse effects on the quality of scientific work in general and may be increasing the burden on journal reviewers. Any unnecessary increase in the volume of scientific publications threatens a proper reviewing process, which is essential for maintaining standards of accuracy and integrity.

The Committee regards rigorous and unbiased peer review not only as central to scientific publishing but also as the most important mechanism for judging the quality of scientific work and scientific projects. Establishing and maintaining good peer review processes is itself a challenge, and it is recognised that quantitative measures and metrics can be useful complements to this process. However, the apparent simplicity and attraction of such numerical indices should not conceal their potential for manipulation and misinterpretation; they should therefore be used only with considerable caution.

Because norms for publication numbers, authorship conventions, and citations differ from field to field, judgements and policy are often best made by peers with expertise in the same area. CFRS urges ICSU Member organizations to stimulate discussion of scientific evaluation criteria, career indicators, and publication records, with the aim of promoting a system that can better serve science in general. Rather than learning to survive in a 'publish or perish' culture, young scientists should be encouraged and supported to produce high-quality scientific communications that make a real contribution to scientific progress.

Questions for consideration regarding the use of metrics in assessing research performance include:

*See, for example, Ross et al., JAMA 299, 1800-1812 (2008).

Addendum

Whilst this statement was being developed, the International Mathematical Union released its Citation Statistics report, a detailed critical analysis of the use of citation data for scientific evaluation. A main conclusion of this report is: "While having a single number to judge quality is indeed simple, it can lead to a shallow understanding of something as complicated as research. Numbers are not inherently superior to sound judgments."



About this statement

This statement is the responsibility of the Committee on Freedom and Responsibility in the conduct of Science (CFRS), which is a policy committee of the International Council for Science (ICSU). It does not necessarily reflect the views of individual ICSU Member organizations.
