U.S. weighs crackdown on experiments that could make viruses more dangerous

In a U.S. government lab in Bethesda, Maryland, virologists plan to equip the strain of the monkeypox virus that spread globally this year, causing mostly rash and flulike symptoms, with genes from a second monkeypox strain that causes more serious illness. Then they’ll see whether any of the changes make the virus more lethal to mice. The researchers hope that unraveling how specific genes make monkeypox more deadly will lead to better drugs and vaccines.

Some scientists are alarmed by the planned experiments, which were first reported by Science. If a more potent version of the outbreak strain accidentally escaped the high-containment, high-security lab at the National Institute of Allergy and Infectious Diseases (NIAID), it could spark an “epidemic with substantially more lethality,” fears epidemiologist Thomas Inglesby, director of the Center for Health Security at the Johns Hopkins University Bloomberg School of Public Health. That’s why he and others argue the experiments should undergo a special review required for especially risky U.S.-funded studies that might create a pathogen that could launch a catastrophic pandemic.

But it’s not clear that the rules apply to the proposed study. In 2018, a safety panel determined it was exempt from review. Monkeypox did not meet the definition of a “potential pandemic pathogen” (PPP), the panel decided, because it didn’t spread easily. Now, with monkeypox widespread, the National Institutes of Health (NIH) is planning to reexamine the work, but it still might not qualify as “enhancing” a PPP, the agency says. That’s because the study will swap natural mutations, not create new ones, so it is not expected to create a monkeypox strain more virulent than the two already known.

The monkeypox controversy marks just the latest flare-up in a decade-old debate over exactly when a study that alters a pathogen is too risky for the U.S. government to fund—and who should have the power to decide. That wrangling became especially ferocious over the past 2 years, as the COVID-19 pandemic spawned allegations, so far unproven, that SARS-CoV-2 escaped from a laboratory in China. Now, in the pandemic’s wake, the U.S. government appears poised to make sizable changes to how it manages so-called gain-of-function (GOF) studies, which tweak pathogens in ways that could make them spread faster or become more dangerous to people.

“There are significant potential risks to both under- and overregulation.”

  • Jesse Bloom, Fred Hutchinson Cancer Center

Last month, an expert panel convened by NIH and its parent agency, the Department of Health and Human Services (HHS), released a draft report that recommends the GOF rules be broadened to include pathogens and experiments that are exempt from the current scheme. If the recommendation is adopted, a decision that could come next year, the monkeypox study could come under tighter scrutiny. And other researchers working with viruses such as Ebola, seasonal flu strains, measles, and even common cold viruses could face new oversight and restrictions.

Some scientists are watching nervously, worried that an expanded definition could worsen what they already see as a murky, problematic oversight system. The existing rules, they say, have caused confusion and delays that have deterred scientists from pursuing studies critical to understanding emerging pathogens and finding ways to fight them. If not implemented carefully, the proposed changes could “greatly impede research into evolving or emerging viruses,” worries virologist Linda Saif of Ohio State University, Wooster. She and others say expanding the regulations could add costly red tape, potentially driving research overseas or into the private sector, where U.S. regulations don’t apply or are looser.

Others say the proposed changes don’t go far enough. They’d like to see the U.S. government create an entirely new independent body to oversee risky research, and for the public to get far more information about proposed experiments that could have fearsome consequences. Some have even called for curbing the now common practice of collecting viruses from wild animals and studying them in the lab, saying it only increases the risks that the viruses—or modified versions—will jump to humans.

“We really should be asking important questions about whether that work should continue,” Inglesby says. And virologist James LeDuc, who retired last year as director of the University of Texas Medical Branch’s Galveston National Laboratory, says, “It’s one thing to recognize that these viruses exist in nature. It’s another to modify them so that you can study them if in fact they could become human pathogens.”

All sides agree on one thing: The proposed rules represent a potential pivot point in the debate over the funding of high-risk GOF studies by the U.S. government, which is one of the world’s largest supporters of virology research. “There are significant potential risks to both under- and overregulation in this field,” says virologist Jesse Bloom of the Fred Hutchinson Cancer Center, who like LeDuc is part of a group of scientists pushing for the changes. “The goal,” he adds, “needs to be to find the right balance.”

The controversy over studies that enhance or alter pathogens ignited a decade ago, but such work goes back more than a century. To make vaccines, for example, virologists have long passaged, or repeatedly transferred, a virus between dishes of animal cells or whole animals, so that it loses its ability to harm people but grows better—a gain of function. Since the late 1990s, genetic engineering techniques have made these studies much more efficient by allowing virologists to assemble new viral strains from genomic sequences and to add specific mutations.

In 2011, two such NIH-funded experiments with H5N1 avian influenza set off alarm bells worldwide. Virologists Yoshihiro Kawaoka at the University of Wisconsin, Madison, and the University of Tokyo and Ronald Fouchier at Erasmus University Medical Center were interested in identifying mutations that could enable the virus, which normally infects birds, to also spread easily among mammals, including humans. Small but frightening outbreaks had shown H5N1 could spread from birds to people, killing 60% of those infected. By introducing mutations and passaging, Kawaoka and Fouchier managed to tweak the virus so it could spread between laboratory ferrets, a stand-in for humans.

Controversy erupted after Fouchier discussed the work at a scientific meeting prior to publication. Soon, worries that the information could land in the wrong hands or that the tweaked virus could escape the lab prompted journal editors and government officials to call for a review by an HHS panel called the National Science Advisory Board for Biosecurity (NSABB). HHS established NSABB after the 2001 anthrax attacks in the United States to consider so-called dual use research that could be used for both good and ill. During the review, flu researchers worldwide voluntarily halted their GOF experiments. Ultimately, NSABB concluded the scientific benefits of the studies outweighed the risks; the H5N1 papers were published and the work resumed.

Then in mid-2014, several accidents at U.S. labs working with pathogens, along with worries about some new GOF papers, prompted the White House to impose a second “pause” on U.S.-funded GOF research. It halted certain studies with influenza and the coronaviruses that cause Middle East respiratory syndrome (MERS) and severe acute respiratory syndrome (SARS), SARS-CoV-2 cousins that have caused small though deadly outbreaks. NIH ultimately identified 29 potential GOF projects in its funding portfolio. After reviews, the agency allowed 18 to resume because it determined they didn’t meet the risky GOF definition or were urgent to protect public health. Some, for example, adapted MERS to infect mice, a step that can help researchers develop treatments. The remaining 11 studies had GOF components that were removed or put on hold.

During the second pause, U.S. officials promised to come up with a more comprehensive approach to identifying and potentially blocking risky studies before they began. Advocates of tighter rules also pushed for less-risky approaches for studying altered viruses, such as using weakened virus strains, computer models, or “pseudoviruses” that can’t replicate.

Many virologists, however, argued that only studies with live virus can accurately show the effect of a mutation. “There’s only so much you can learn [from alternative techniques],” says University of Michigan, Ann Arbor, virologist Michael Imperiale, who supported the H5N1 GOF studies. “Sometimes using intact virus is the best approach.”

An unfolding debate

Researchers have long conducted gain-of-function (GOF) research that gives viruses and other pathogens new capabilities. But a decade ago, studies that enabled H5N1 avian flu to more easily spread among mammals kicked off a debate that continues today over how tightly the United States should regulate such research.

(Timeline graphic: V. Altounian/Science)

In 2017, the debate culminated with the release of the current HHS policy, dubbed Potential Pandemic Pathogens Care and Oversight (P3CO). It requires that an HHS panel review any NIH-funded study “reasonably anticipated” to create or use an enhanced version of an already highly virulent, highly transmissible pathogen that might cause a pandemic. But it exempts natural, unmodified viruses and GOF work done to develop vaccines or as part of surveillance efforts, such as tweaking a circulating flu virus to assess the risks of a newly observed variant.

The HHS committee charged with implementing the policy, which operates behind closed doors, has since reviewed only three projects and approved all of them. Two were continuations of Kawaoka’s and Fouchier’s H5N1 work. (Both grants have since expired.) The third involved work with H7N9 avian influenza, but the investigator later agreed to use a nonpathogenic flu strain.

Other concerning studies have been given a pass, critics say. As an example, they point to work led by coronavirus expert Ralph Baric of the University of North Carolina, Chapel Hill. In the 2000s, his team became interested in determining whether bat coronaviruses had the potential to infect humans. (COVID-19 has since shown the answer is emphatically yes.) But the researchers often could not grow the viruses in the laboratory or enable them to infect mice. So they created hybrid, or chimeric, viruses by grafting the gene encoding the surface protein, or “spike,” that a wild bat virus uses to enter host cells into a SARS strain that infects mice.

NIH let this work continue during the 2014 pause. The researchers had no intention of making the mouse-adapted SARS virus more risky to people, Baric has said. But something unexpected happened when his lab added the spike gene from a bat coronavirus called SHC014: The chimeric virus sickened mice carrying a human lung cell receptor, Baric’s team reported in 2015 in Nature Medicine. The hybrid virus could not be stopped by existing SARS antibodies or vaccines. In essence, critics of the work assert, it created a potential pandemic pathogen.

A review panel might “deem similar studies building chimeric viruses based on circulating [bat coronavirus] strains too risky to pursue,” Baric acknowledged. Yet he has also called these chimeric viruses “absolutely essential” to efforts to test antiviral drugs and vaccines against coronaviruses, and many virologists agree. They also argue that Baric’s work and related experiments provided an early warning that, if heeded, might have helped the world prepare for the COVID-19 pandemic.

The pandemic has supercharged the GOF debate, in large part because of unproven but high-profile allegations—including from former President Donald Trump—that SARS-CoV-2 emerged from a laboratory in Wuhan, China. One prominent advocate of the lab-leak theory, Senator Rand Paul (R–KY), a senior member of the Senate’s health panel, has sparred with NIAID Director Anthony Fauci over experiments in virologist Shi Zhengli’s lab at the Wuhan Institute of Virology (WIV). With money from an NIH grant to a U.S. nonprofit organization, the EcoHealth Alliance, Shi had created chimeras by adding spike proteins from wild bat coronaviruses to a SARS-related bat strain called WIV1. The WIV researchers used methods developed by Baric, who has collaborated with Shi.

Last year, documents obtained by the Intercept showed that NIH had exempted the EcoHealth grant from the P3CO policy, just as it had exempted Baric’s work during the 2014 pause. (The agency later explained that the bat coronaviruses were not known to infect humans.) But NIH also said that if Shi’s lab observed a 10-fold increase in a chimeric virus’ growth compared with WIV1, it wanted to be informed, because the experiments could then require a P3CO review.

The documents show WIV did observe increased growth in the lungs of infected mice and more weight loss and death in some animals. NIH has said EcoHealth failed to report these “unexpected” results promptly as required, but EcoHealth disputes this. Paul and some proponents of the lab-leak theory have gone further, alleging that NIH actively conspired with EcoHealth to hide the risks of the study.

As often is the case in GOF debates, there is no scientific consensus on whether the WIV experiments—or the results—crossed a red line. Paul and some scientists have fiercely argued that they were unacceptably risky. Others forcefully disagree. NIH officials, meanwhile, have emphasized that the hybrid viruses created by Shi’s lab were genetically too distant from SARS-CoV-2 to have generated the pandemic.

Even as NIH officials have defended their assessment of the EcoHealth grant, they have conceded the pandemic made it clear that the GOF rules needed a fresh look. In February, NIH asked NSABB to broaden an existing review of the P3CO policy, launched in January 2020 to examine ways to increase transparency in the review board’s membership and deliberations. Now, NSABB had bigger issues to weigh: Some White House officials even wanted the panel to consider whether the United States should simply ban funding for some kinds of GOF studies.

“I worry that people will [fear] accidentally tripping a wire.”

  • Gigi Kwik Gronvall, Johns Hopkins University

In September, an NSABB working group released a draft report that does not go that far. It does recommend that GOF work done for vaccine development and pathogen surveillance no longer be automatically exempt from P3CO review. It also recommends that the definition of a pathogen that triggers a review be significantly expanded to include two new categories not explicitly covered by the current rules.

One category would sweep in “potentially highly transmissible pathogens having low or moderate virulence or case-fatality rates.” That definition would cover SARS-CoV-2, which studies suggest kills about 1% or less of infected people. It also could include tuberculosis bacteria, measles, seasonal flu, and the noroviruses that cause stomach bugs, Saif and others suggest.

The second category would include pathogens that are “less transmissible” but have “higher virulence or case-fatality rates.” That definition could include rabies, the Nipah virus spread by fruit bats, and Ebola, which is deadly but isn’t easily transmitted because it’s spread through blood or other body fluids.

Even with the new rules, determining whether a pathogen or experiment fits into a reviewable category will remain a judgment call. Predicting whether a virus can become “highly transmissible,” for example, can be difficult. So can defining “low or moderate” virulence, acknowledges working group co-chair Syra Madad, an epidemiologist at New York University, speaking in a personal capacity. Policymakers should provide illustrative examples, her panel said. Its final recommendations are due out in December 2022 or January 2023.

Some researchers worry this subjectivity will deter researchers from pursuing valuable pathogen science, for fear they’ll get entangled in red tape. “When things are unpredictable, I worry that people will avoid going close to the line for fear of accidentally tripping a wire,” says Gigi Kwik Gronvall, a biosecurity specialist at Johns Hopkins.

Other scientists, however, say even an expanded policy could be too lax. Shi’s WIV1 chimeric virus experiments, for example, still might not qualify for review because the starting viruses weren’t known to cause human disease. And the NIAID monkeypox studies may not qualify because they aren’t creating new genes. Still, the gene swapping is “like changing the machinery of a clock where you have a lot of different pieces that work together. We don’t know exactly how it is going to work,” says monkeypox virologist Gustavo Palacios of the Icahn School of Medicine at Mount Sinai.

To close some gaps, a group of GOF critics organized by Inglesby has urged NSABB to expand the review requirements to include GOF studies of any pathogen, however harmless, that could be manipulated to become a PPP. And others have urged that the reviews be conducted by a new, independent agency rather than HHS, which they argue has been reluctant to aggressively regulate studies it funds through NIH.

Currently, NIH is funding at least 11 grants that likely should have gone through P3CO review but did not, estimates molecular biologist Richard Ebright of Rutgers University, Piscataway, a prominent GOF critic who has surveyed the agency’s grant abstracts. (He says full proposals, which are typically not public, would verify his estimate.) They involve eight institutions in the United States, most studying flu, SARS, and MERS. His list includes a currently funded grant proposal by EcoHealth that describes plans for further bat coronavirus chimera work in Baric’s lab.

But a broader P3CO policy will affect “still a pretty small area” of research, suggests Lyric Jorgenson, acting director of the NIH Office of Science Policy. And this time, she does not expect another “crippling” shutdown of experiments while they are reviewed.

A U.S. clampdown will have no sway over privately funded GOF research or over what happens in other countries, which typically lack policies like the P3CO framework. In Japan and most of Europe, for example, oversight is limited to rules on biosafety and, sometimes, biosecurity, along with voluntary self-regulation, say biosecurity experts Gregory Koblentz of George Mason University and Filippa Lentzos of King’s College London. It’s too soon to say how a 2020 Chinese biosafety law will affect PPP research, they say.

Such rules have not prevented GOF work that some researchers consider too risky. For example, since 2018 labs in China have published at least three papers in journals describing experiments with potential pandemic bird flu strains that Bloom thinks might have crossed the line because they added mutations for drug resistance or adaptation to mammals. None, however, was “as alarming as the earlier Fouchier or Kawaoka [H5N1] studies,” says Bloom, who examined the papers for Science.

A study described in a June preprint by a team at the Pasteur Institute has also drawn concerns. The scientists passaged a bat coronavirus from Laos that is a distant cousin of SARS-CoV-2 through human cells and in mice to see whether it acquired a specific mutation that would help it infect people. The virus did not—a finding that some scientists said sheds light on how the COVID-19 pandemic began. But others told The New York Times that the work, which was reviewed by a local biosafety committee, might not have been worth the risk.

Meanwhile, a growing number of laboratories around the world are jumping into the field. In an interview with the MIT Technology Review last year, for example, Baric noted that just three or four labs were engineering bat coronaviruses before the pandemic, but that number has since multiplied. The expansion is “unsettling,” he said, because some “inexperienced” groups could proceed “with less respect for the inherent risk posed by this group of pathogens.” (Baric could not be reached for this story.)

Some GOF critics hope to launch a broader global dialogue about how to regulate high-risk pathogen studies. Bloom and Lentzos are part of the Pathogens Project, a 1-year task force launched in September by the Bulletin of the Atomic Scientists, best known for its Doomsday Clock warning of threats such as nuclear war. The project will gather international experts, including University of Cambridge microbiologist Ravindra Gupta, who has advised the United Kingdom’s COVID-19 response, and George Gao, former director of China’s Center for Disease Control and Prevention, to hammer out recommendations for working safely with risky pathogens.

Co-chair and microbiologist David Relman of Stanford University says, “The idea is to reach out and try to find a broad set of interested parties from across the globe and ask, what are the key questions? What are some possible actions? Is there an appropriate international entity right now that could take this on?” Those may be modest goals, he says, but it’s a start.

source: sciencemag.org