
    How to Protect the World From an Accidental Pandemic



By Tom Inglesby, Anita Cicero and Marc Lipsitch

    It’s been about a century since viruses were recognized as causing devastating human diseases. Since then, scientists and public health experts have diligently tried to reduce the threats they pose by developing vaccines and treatments, improving ventilation and more.

    So it was stunning when, in 2012, scientists published papers describing how they had done the opposite: They had genetically engineered highly lethal avian flu viruses to make them more contagious between mammals, potentially including humans. The researchers said they pursued this work to deepen their scientific understanding of avian influenza.

    We were among the many experts around the world who objected to their research. The risk of an accidental or deliberate pandemic emerging from these enhanced viruses far outweighed any potential scientific benefit.

In the six years before the Covid-19 pandemic, the U.S. government paused, then restarted, funding for such work, putting in place restrictions that, although stronger than those in many other countries, some researchers still seemed to find ways around. The rules lacked transparency about what research was being approved and funded, and they were insufficient to ensure safety and security.

    The potential role of a laboratory accident in causing the Covid pandemic remains uncertain and widely debated. But what is clear is that we still urgently need stronger government oversight of risky virus research.

The U.S. government recently released a detailed new policy, which if fully implemented would establish strong oversight and set concrete rules about whether and under what conditions this kind of high-risk scientific work can be done. This work was initially called “gain of function” research, because much of it involved giving viruses new abilities. But many scientists recognized that term as vague and overly broad. The new policy corrects this misnomer and addresses oversight of scientific research involving pathogens with enhanced pandemic potential, or PEPP.

This new policy, which needs to be in place and operating by May 2025, defines for researchers and their institutions which types of scientific work are risky enough to require special government review and approval. Any scientist who anticipates that her research could create a more lethal or transmissible pathogen that risks causing a severe outbreak would need high-level federal review and approval for that work before proceeding.

    Experiments that interfere with or disrupt how the human immune system defends against a pathogen must also be scrutinized under this policy. Scientists working to resurrect extinct or eradicated viruses that caused past epidemics or pandemics must similarly obtain permission from federal officials.

The policy eliminates some exemptions that might have offered a path for avoiding high-level review in the past, and it requires the signature of an accountable federal official for the work to start. We think the new policy is robust and a clear improvement, in both scope and content, over past U.S. efforts to reduce the risks posed by this kind of research. The U.S. government should actively persuade other countries to set up similar systems to ensure this realm of science is safely governed worldwide.

Still, the new policy doesn’t solve all of the challenges. There are gaps that need to be addressed, either as the White House evaluates and amends the policy during its implementation this year or by Congress as it exercises its oversight. One concern is that the policy does not apply to all research, only to research funded by the U.S. government or otherwise required to comply with a federal oversight policy. Congress should require that this policy apply to all PEPP research activities, regardless of funding source. The federal government also needs to set expectations and rules for scientists who use computational tools in ways that could create pandemic-level risks. When and how the government should evaluate such tools still needs to be established, and it’s especially important to do so as artificial intelligence enhances our capabilities to create toxins and designer microbes that may present new risks.

To build trust that the policy is actually working, the process needs to be much more transparent. As currently written, the policy calls for publishing a single annual aggregate report across the government. Much better would be quarterly reporting with summaries of PEPP research projects proposed under any funding agency, a description of how proposals are being evaluated and a rationale for the decisions that are made. Concerns about intellectual property, while real, are outweighed by the need to tell the public how it’s being safeguarded from deadly disease.

There has been increased scrutiny of the National Institutes of Health’s role in the review process for PEPP research that the agency is considering funding. It would be unwise for Congress to eliminate the involvement of federal funding agencies, such as the N.I.H., in the review process. There is substantial expertise and experience in the N.I.H. on these issues, and that should be brought to bear as part of the review process.

The N.I.H. and other agencies must eliminate any real or perceived conflicts of interest among officials involved in deciding which research gets reviewed and in performing the reviews. The new policy makes clear that final decisions on PEPP proposals submitted to the N.I.H. will be made by senior U.S. Health and Human Services officials outside the agency, which we support.

Researchers who want to make pathogens more transmissible, lethal or capable of evading the human immune system should bear great responsibility for first proving several things: that this work is critical to perform, that the public health benefits aren’t achievable through safer approaches and that those benefits outweigh the extraordinary potential risks. They also need to have the competence and expertise, as well as the laboratory safety protocols and engineering controls, in place to prevent accidents or accidental infections.

    This new policy is not perfect, but it represents the most robust governance and clearest set of rules in the world for this kind of science. Strong, effective oversight for this research should be a priority for the United States and the world. None of us should bear the risk of living through a future pandemic of humanity’s own making.
