
The volume and breadth of COVID-19 research now underway and being shared widely is overwhelming: clinical studies and research reports on host genetic analysis, diagnostic modalities, antiviral drug and vaccine development, and socio-economic determinants. Since January, the major biomedical preprint servers, bioRxiv and medRxiv, have posted more than 9,000 papers related to SARS-CoV-2. To cope with the unprecedented pace of research, prominent journals like Science and The Lancet have become powerful gatekeepers. After all, published, peer-reviewed science plays a critical role in informing policy on life-and-death matters: the adoption of new drug treatments, clinical guidelines in our hospitals and doctors’ offices, and new methods to avoid infection as we reopen our economy.

However, the pandemic crisis has exposed cracks in the foundations of traditional publishing models. For example, both The Lancet and the New England Journal of Medicine have been forced to retract highly influential research studies by credible academics at prestigious institutions after the underlying data was deemed unreliable. “They were unable to complete an independent audit of the data underpinning their analysis,” the retraction notice in The Lancet reads. “As a result, they have concluded that they ‘can no longer vouch for the veracity of the primary data sources.’”

    These retractions not only highlight the effects of mounting pressure to publish quickly, but also underscore the lack of transparency and accountability that can result from the widespread use of anonymized and opaque peer review.

    Open preprint archives—where researchers post studies that have not undergone scientific peer review—are an important way to create more timely, open, and inclusive access to new science. However, preprints of COVID-19 research—now numbering more than 20,000 in circulation—can lead the media, policy-makers, clinicians, and the general public to assume that the work has been formally vetted prior to being posted. In a recent case, an unreviewed paper claiming an anti-parasitic drug could treat coronavirus was widely promoted in Latin America. Researchers withdrew the preprint, citing the need for further analysis, but there were already reports of people rushing to pharmacies and self-administering the unproven medication.

    We need a transformation in how early data is shared. The urgent need for peer-reviewed science, coupled with the potential harms of unreviewed publication, has set the stage for a public discussion on the future of academic publishing. It’s clear that we need rapid, transparent peer review that allows reviewers, authors, and readers to engage with one another, and dynamic use of technology to accelerate publishing timelines without reducing academic rigor or researcher accountability. However, the field of academic publishing will need significant financial support to catalyze these changes.

    Philanthropic organizations, as longtime supporters of scientific research, must be at the vanguard of the effort to fund improvements in how science is curated, reviewed, and published. When the MIT Press first began to address the need for rapid dissemination of COVID-19-related research and scholarship—by making a selection of relevant e-books and journal articles freely available, and by developing a new, rapid publication model for books under the imprint First Reads—senior staff were interested in undertaking bolder efforts to address the specific problems engendered by the pandemic. The proliferation of preprints related to COVID-19 was already apparent, as was the danger of unvetted science seeding mainstream media stories with deleterious results.

    Rapid Reviews: COVID-19 (RR:C19) is an innovation in open publishing that allows for rigorous, transparent peer review that is publicly shared in advance of publication. We believe that pushing the peer review process further upstream—so that it occurs at the preprint stage—will benefit a wide variety of stakeholders: journalists, clinicians, researchers, and the public at large.  

    The process of creating RR:C19 began when Stefano Bertozzi, former Dean of the School of Public Health at UC Berkeley, accepted our invitation to serve as Editor in Chief, and his colleague Hildy Fong Baker, Executive Director of the UC Berkeley Center for Global Public Health, was recruited as Managing Editor. MIT and UC Berkeley worked with the Patrick J. McGovern Foundation to fund the development and launch of the journal, providing the necessary support to commence work in May.

    To create a robust, rapid-response publishing model, RR:C19 uses COVIDScholar, a natural language processing (NLP) tool developed at Lawrence Berkeley National Laboratory. The RR:C19 editorial office uses the tool to “power search” COVID-19 papers, and the COVIDScholar team has helped the RR:C19 team create a platform and interface leveraging preprint data so that we can rapidly identify and fast-track papers for peer review. For example, the interface updates in near real time to reflect new preprints uploaded to select preprint servers, and a machine-learning algorithm helps us prioritize preprints to review. RR:C19 relies on both the interface and a network of scholars and researchers to carefully filter current preprints for peer review.
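
    To make the prioritization step concrete, here is a minimal, hypothetical sketch of how an editorial team might rank incoming preprints by topical relevance. It is not the COVIDScholar codebase: the sample preprint records and query terms are placeholders, and the scoring uses simple TF-IDF similarity rather than COVIDScholar’s actual machine-learning model.

```python
# Hypothetical sketch: rank newly posted preprints by topical relevance so that
# editors can triage which ones to consider for rapid peer review. This only
# illustrates the general idea of ML-assisted prioritization.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Placeholder metadata standing in for a feed of new preprints (title + abstract text).
preprints = [
    {"id": "2020.08.001", "text": "Seroprevalence of SARS-CoV-2 antibodies in a population-based survey in Brazil"},
    {"id": "2020.08.002", "text": "Modeling campus reopening strategies for universities during the COVID-19 pandemic"},
    {"id": "2020.08.003", "text": "A deep learning method for predicting protein folding from sequence data"},
]

# Editorial priorities expressed as a free-text query (an assumed input, not RR:C19's actual criteria).
query = "SARS-CoV-2 transmission seroprevalence vaccines antiviral therapy university reopening"

# Vectorize the preprints and the query together, then score each preprint
# by cosine similarity to the query.
vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform([p["text"] for p in preprints] + [query])
scores = cosine_similarity(doc_matrix[-1], doc_matrix[:-1]).ravel()

# Present the highest-scoring preprints first for human screening.
for preprint, score in sorted(zip(preprints, scores), key=lambda x: -x[1]):
    print(f"{preprint['id']}  relevance={score:.2f}")
```

    In practice, a ranking like this only narrows the pool; as described above, domain specialists and the editorial office make the final call on which preprints enter review.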

    By offering peer review of preprints, our goal is to help stanch the spread of misinterpretation and disinformation and accelerate the uptake of validated science, so that clinicians, researchers, and policy-makers can make sound, evidence-based decisions.

    Since the initial rollout of peer reviews in mid-August, RR:C19 has continuously selected important and intriguing “preprints of the week” for review. With nearly 30 articles reviewed to date, each by two peer reviewers, RR:C19 has succeeded in devising workflows to support the process—a complex operation—and, more importantly, in reviewing critical preprints in a timely manner. Peer reviewers are selected by the RR:C19 editorial office: once a preprint is chosen, domain coordinators and specialists identify experts and researchers qualified to review it. We conduct extensive research to find the right reviewers for each preprint and rely heavily on our own network of scholars to identify them.

    For example, RR:C19 released reviews of preprints on university reopening during the COVID-19 pandemic prior to the start of the fall semester at most American universities. Reviews of “Electoral Repercussions of a Pandemic” were released with the presidential election two months away, and reviews of the largest seroprevalence study in Brazil were published within days of its pick-up across various media outlets. The model has also allowed RR:C19 to stay on top of studies addressing advances in the medical, biochemical, and engineering science domains—providing preliminary assessments of the reliability of studies investigating antibodies, antiviral therapies, vaccines, targeted therapies, bioengineering techniques, rapid tests, and the cellular response of COVID-19 patients.

    The new journal aspires to provide a proof-of-concept that promises to transform academic publishing by harnessing the potential of artificial intelligence and machine learning.

    With this and future efforts, we’ve identified five key opportunities to align academic publishing priorities with the public good:

    1. Transparency: Redesign and incentivize the peer review process to publish all peer reviews alongside primary research, reducing duplicate reviews and allowing readers and authors to understand and engage with the critiques.
    2. Accountability: The roles of various authors on any given manuscript should be clearly defined and presented for the readers. When datasets are used, one or more of the authors should have explicit responsibility for verifying the integrity of the data and should document that verification process within the paper’s methodology section.
    3. Urgency: Scientific research can be slow-moving and time-consuming. Publishing data does not have to be. Publishing houses should build networks of experts who are able to dedicate time to scrutinizing papers promptly, with the goal of rapid yet rigorous review.
    4. Digital-First Publishing: While science is a dynamic process of continued learning and exploration, much of scientific publishing conforms to outdated print models. Academic journals should explore opportunities to deploy AI-powered tools to identify peer reviewers or relevant preprint scholarship, along with digital publishing platforms that enable more visible communication and collaboration around research findings. Not only can reviews be closer to real-time, but authors can easily respond and modify their work for continuous quality improvement.
    5. Funding: Pioneering new solutions in academic publishing will require significant trial and error, at a time when traditional business models such as library subscriptions are in decline. Philanthropies should step forward to provide catalytic risk financing, testing new models and driving social good outcomes.

    In our efforts to date, RR:C19 has harnessed these key opportunities to advance academic publishing in a dynamically evolving landscape. It has required extraordinary levels of collaboration, goodwill, and innovation, as well as the engagement of a future generation of peer reviewers, and the energy of students who are eager to contribute to the wider effort. In this sense, the potential to respond to the current pandemic is also the potential to train and motivate students and early career scientists.

