Government-funded scientists here in the U.S. are a step closer to being able to resume some controversial experiments with lab-altered bird flu viruses.

Researchers in the Netherlands restarted their experiments a week ago, after scientists around the world declared an end in January to a year-long voluntary moratorium. But labs that depend on funding from the National Institutes of Health to do this work have had to wait for the agency to tell them what kinds of studies can proceed.

Now, in an article for the journal Science, officials with the NIH and the Department of Health and Human Services have unveiled a new policy for making those decisions.

At the same time, the White House Office of Science and Technology Policy is releasing a draft oversight system that would require research institutions and scientists to screen other kinds of especially risky experiments involving pathogens and toxins from a long list of dangerous agents. The process could help prevent a repeat of the kind of public furor that erupted over the bird flu studies.

The influenza experiments that caused such concern involve mutant forms of the bird flu virus H5N1. The wild virus circulates among poultry in parts of Asia and the Middle East and only rarely causes illness in people. But over half of those known to have gotten sick from it have died.

To understand how this virus might mutate in a way that would let it start spreading from person to person, scientists at the University of Wisconsin-Madison and Erasmus Medical Center in the Netherlands made genetically altered forms of the H5N1 virus. They found that these lab-created viruses could spread through the coughs and sneezes of ferrets, which are used in flu studies as stand-ins for people.

Critics have argued that these experiments were dangerous because the newly created viruses could potentially start spreading and cause a pandemic if they ever got out of the lab.

The new decision-making policy on these types of H5N1 experiments aims at balancing the risks and benefits. It lays out a special review process and seven different criteria that proposed experiments would have to meet.

For example, scientists could only create a new H5N1 virus with enhanced transmissibility in mammals if the mutations could be produced through a natural evolutionary process. The proposed research would have to address a scientific question with high significance to public health, and there would have to be no feasible alternative methods to address that question. What's more, researchers would have to show they could mitigate the biosecurity and biosafety risks.

Amy Patterson, director of the office of science policy at the NIH, acknowledged that deciding whether or not a proposal meets these criteria will involve judgment calls. "I don't think that there's any way around that," she said. "These are not going to be necessarily black-and-white decisions that can be made."

She says if this system had been in place before the two controversial bird flu studies took place, both would have passed muster — but the tricky issues they raised would have gotten a lot more upfront consideration.

Virologist Ron Fouchier of Erasmus Medical Center, whose lab did one of these original experiments under a contract with the NIH, says he's restarted his work using another source of money. But he has also already submitted paperwork to the NIH to try to get approval under this new process to use that funding as well.

"We think that this work should be done, and many other scientists with us, and now we just have to see how this review panel will judge," says Fouchier.

"Some criteria make more sense than others, I think," says Fouchier. "How do you provide evidence that something that you do in the lab might actually happen in nature? I think we have offered strong evidence that what we did in the lab actually might happen in nature. That's really the toughest criterion to fulfill."

Stanford microbiologist David Relman says that whether a mutation might arise naturally is "a matter of conjecture," and he questions how relevant that question actually is to what researchers might propose to do in a lab.

"There's a big difference between a mutant virus arising amidst a pool of other, different viruses deep in the lung of a Siberian duck in the deepest part of Siberia where it will have a very little chance of encountering a human and a very different scenario in which someone makes this virus in very high concentrations, by itself, in a laboratory, where it may have many opportunities for entering or being exposed to a human," says Relman.

Another open question is how the requirement for lowering risk will be judged. "How far do you want to mitigate a risk? Zero risks do not exist. They never exist," says Fouchier.

Some say that in order to justify the risks of making new bird flu viruses with pandemic potential, the public health benefits have to be immediate and clear-cut. The skeptics aren't convinced that researchers have made the case.

"I have yet to hear a compelling rationale for why any of the proposed experiments is likely to be the best way to get a vaccine or a realistic way to improve surveillance," says Marc Lipsitch with the Harvard School of Public Health.

He thinks it's helpful that researchers now have a way of proposing these types of experiments. But he worries that the new policy might allow experiments with lab-altered bird flu strains that are "scientifically interesting but unlikely to change the way we confront the next flu pandemic. And those kinds of experiments, to me, are not worth the small but real risk of infection by accident or deliberately that goes along with them."

Besides the risk posed by these mutant H5N1 viruses themselves, some have worried that information about how to make them could be misused to deliberately cause harm. That's why, for months, scientists argued about whether the details of the experiments should be made public.

The research was finally published in science journals last year, but the debate drew new attention to the so-called dual-use problem — the idea that legitimate biological research or technologies might have a dark side.

Last March, the federal government adopted a policy for how it would oversee proposals for dual-use research involving a list of high-risk pathogens and toxins. Now, officials have issued a draft plan that describes how federally funded research institutions should identify and manage that kind of work as well.

Copyright 2016 NPR. To see more, visit http://www.npr.org/.