From a Bayesian perspective, uncertainty is encoded as randomness. The researchers began by supposing that the reproductive number followed various distributions (the priors). They then modeled the uncertainty with a random variable that fluctuates, taking values as small as 0.6 and as large as 2.2 or 3.5. In something of a nesting process, the random variable itself has parameters that fluctuate randomly; those parameters, in turn, have random parameters (hyperparameters), and so on. The effects accumulate into a “Bayesian hierarchy”; “turtles all the way down,” Dr. Holmes said.

The effects of all these up-and-down random fluctuations multiply, like compound interest. As a result, the study found that using random variables for reproductive numbers more realistically predicts the risky tail events, the rarer but more significant superspreader events.
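The intuition can be sketched in a few lines of code. This is not the researchers' actual model; it is a minimal simulation, with illustrative numbers, comparing secondary-case counts when the reproductive number is fixed versus when it is itself a random variable (here lognormal) with the same average. The random version produces far more extreme counts, the "superspreader-like" tail events.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000          # number of simulated cases
R_mean = 1.5         # illustrative average reproductive number (assumption)

# Fixed-R model: every case transmits at the same average rate.
fixed = rng.poisson(R_mean, size=n)

# Random-R model: each case's rate is itself drawn from a lognormal
# distribution, with mu chosen so the mean still equals R_mean.
sigma = 1.0
mu = np.log(R_mean) - sigma**2 / 2   # lognormal mean = exp(mu + sigma^2/2)
random_R = rng.lognormal(mu, sigma, size=n)
varied = rng.poisson(random_R)

# Both models infect the same number of people on average, but the
# random-R model yields many more cases with 10+ secondary infections.
print(fixed.mean(), varied.mean())
print((fixed >= 10).mean(), (varied >= 10).mean())
```

The means match, yet the tail probabilities differ by orders of magnitude, which is the sense in which random reproductive numbers "more realistically predict the risky tail events."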

Humans on their own, however, without a Bayesian model as a compass, are notoriously bad at gauging individual risk.

“People, including very young children, can and do use Bayesian inference unconsciously,” said Alison Gopnik, a psychologist at the University of California, Berkeley. “But they need direct evidence about the frequency of events to do so.”

Much of the information that guides our behavior in the context of Covid-19 is probabilistic. For example, by some estimates, if you get infected with the coronavirus, there is a 1 percent chance you will die; but in reality an individual’s odds can vary by a thousandfold or more, depending on age and other factors. “For something like an illness, most of the evidence is usually indirect, and people are very bad at dealing with explicit probabilistic information,” Dr. Gopnik said.

Even with evidence, revising beliefs isn’t easy. The scientific community struggled to update its priors about asymptomatic transmission of Covid-19, even as evidence emerged that it was a real factor and that masks were a helpful preventive measure. This arguably contributed to the world’s sluggish response to the virus.

“The problems come when we don’t update,” said David Spiegelhalter, a statistician and chair of the Winton Centre for Risk and Evidence Communication at the University of Cambridge. “You can interpret confirmation bias, and so many of the ways in which we react badly, by being too slow to revise our beliefs.”
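The updating that Dr. Spiegelhalter describes has a textbook form. The sketch below is a generic conjugate Beta-binomial update, not anything from the article, and the numbers are purely illustrative: a skeptical prior about a transmission probability is revised as hypothetical evidence accumulates.

```python
# Beta-binomial updating: a Beta(a, b) prior on a probability is revised
# by adding observed successes and failures to its parameters.
def update(a, b, successes, failures):
    return a + successes, b + failures

def mean(a, b):
    return a / (a + b)

# Skeptical prior: the event is thought unlikely (prior mean 0.1).
a, b = 1, 9
prior_mean = mean(a, b)            # 0.1

# Hypothetical evidence arrives: 30 events observed in 60 trials.
a, b = update(a, b, 30, 30)
posterior_mean = mean(a, b)        # 31/70, about 0.44

print(prior_mean, posterior_mean)
```

Failing to update, in this framing, means clinging to the prior mean even after the data have pulled the posterior well away from it.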

source: nytimes.com
