The peer review system no longer works to guarantee academic rigour - a different approach is needed
Peer review is a central feature of academic work. It’s the process through which research ends up published in an academic journal: independent experts scrutinise the work of another researcher in order to recommend whether it should be accepted by a publisher and, if so, how it should be improved.
Peer review is often assumed to guarantee quality, but it doesn’t always work well in practice. Every academic has their own peer-review horror stories, ranging from years-long delays to multiple tedious rounds of revisions. The cycle continues until the article is accepted somewhere or until the author gives up.
On the other side, the work of reviewing is voluntary and largely invisible. Reviewers, who often remain anonymous, go unrewarded and unrecognised, even though their work is an essential part of research communication. Journal editors are finding it increasingly difficult to recruit peer reviewers.
And we know peer review, however much it is lauded, often does not work. It is sometimes biased, and too often allows errors, or even scholarly fraud, to creep through.
Clearly the peer-review system is broken. It is slow, inefficient and burdensome, and the incentives to carry out a review are low.
Publish first
In recent years, alternative ways to scrutinise research have emerged which attempt to fix some of the problems with the peer-review system. One of these is the “publish, review, curate” model.
This reverses the traditional review-then-publish model. An article is first published online, then peer reviewed. The approach is too new for us to know how it compares with traditional publishing, but there is optimism about its promise: proponents suggest that increased transparency in the review process will speed scientific progress.
We have set up a platform using the publish, review, curate model for the field of metaresearch – research about the research system itself. Our aims are both to innovate peer review in our field and to study this innovation as a metaresearch experiment of sorts. This initiative will help us to understand how we can improve peer review in ways that we hope will have implications for other fields of research.
The platform, called MetaROR (MetaResearch Open Review), has just been launched. It is a partnership between an academic society, the Association for Interdisciplinary Meta-Research and Open Science, and a non-profit metaresearch accelerator, the Research on Research Institute.
In the case of MetaROR, authors first publish their work on a preprint server. Preprints are versions of research papers made available by their authors before peer review as a way of accelerating the dissemination of research. Preprinting has been common in a few academic disciplines for decades, but increased in others during the pandemic as a way of getting science into the public domain faster. MetaROR, in effect, builds a peer-review service on top of preprint servers.
Authors submit their work to MetaROR by providing MetaROR with a link to their preprinted article. A managing editor then recruits peer reviewers who are experts on the article’s object of study, its research methods, or both. Reviewers with competing interests are excluded whenever possible, and disclosure of competing interests is mandatory.
Peer review is conducted openly, with the reviews made available online. This makes the work of reviewers visible, reflecting the fact that review reports are contributions to scholarly communication in their own right.
We hope that reviewers will increasingly see their role as engaging in a scholarly conversation in which they are a recognised participant, although MetaROR still allows reviewers to choose whether or not to be named. Our hope is that most reviewers will find it beneficial to sign their reviews, and that this will significantly reduce the problem of anonymous, dismissive or otherwise bad-faith reviews.
Since articles submitted to MetaROR are already publicly available, peer review can focus on engaging with an article with a view to improving it. Peer review becomes a constructive process, rather than one that valorises gatekeeping.
Evidence suggests preprints and final articles actually differ surprisingly little, but improvements can often be made. The publish, review, curate model helps authors engage with reviewers.
Following the review process, authors are left to decide whether to revise their article and how. In the MetaROR model, authors can also choose to submit their article to a journal. To offer authors a streamlined experience, MetaROR is collaborating with several journals who commit to using MetaROR reviews in their own review process.
Like other publish, review, curate platforms, MetaROR is an experiment. We will need to evaluate it to understand its successes and failures. We hope others will too, so we can learn how best to organise the dissemination and evaluation of scientific research – without, we hope, too many peer-review horror stories.
Stephen Pinfield is an Editor of MetaROR and a Senior Research Fellow of the Research on Research Institute.
Kathryn Zeiler is Editor-in-Chief of MetaROR and a member of the board of the Association for Interdisciplinary Meta-Research and Open Science.
Ludo Waltman is Editor-in-Chief of MetaROR and a Co-chair of the Research on Research Institute.