In January I claimed:
An extraordinary claim is usually itself extraordinary evidence … I would be very unlikely to make such claims in situations where I did not have good reasons to think them true. The times to be more skeptical of unlikely claims are when there is a larger than usual chance that someone would make such a claim even if it were not true.
Eliezer responded, and then I outlined a formal model. I now have a working paper. In it, I consider the effect of organizing people into a reporting chain, such as up the levels of an organization, or from researcher to referee to editor to reporter to editor, and so on. The interesting new result:
When people are organized into a reporting chain, noise levels grow exponentially with chain length; long chains seem incapable of communicating extraordinary evidence.
For example, imagine disaster frequency were inversely proportional to deaths, that we faced a disaster big enough to kill us all, and that nature gave someone a signal about the casualties, a signal with a (lognormal) standard deviation of a factor of ten. But imagine the news of this signal passed through a chain of seven noisy people, each of whom adds 10% to the signal's standard deviation. While nature's signal itself is clear evidence of an unprecedented disaster, the signal that appears at the chain's end gives a median estimate of about one death; our warning is lost.
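The collapse can be sketched numerically. This is a simplified reconstruction, not the working paper's exact model: I assume a power-law prior density p(d) ∝ 1/d² (so the frequency of disasters with at least d deaths falls as 1/d), a signal saying roughly everyone dies (world population taken as 6.5 billion, an assumption), and noise measured as the standard deviation in log space. Under those assumptions the posterior median is signal × exp(−σ²), so the chain's compounding noise shrinks the credible estimate enormously:

```python
import math

WORLD_POP = 6.5e9            # rough world population (assumption)
SIGMA_NATURE = math.log(10)  # lognormal sd of a factor of ten, in natural-log units

def chain_sigma(sigma0, links, growth=1.10):
    """Each link in the chain adds 10% to the signal's log-sd,
    so noise grows geometrically with chain length."""
    return sigma0 * growth ** links

def posterior_median(signal, sigma):
    """Posterior median of deaths given a lognormal signal and an
    assumed power-law prior p(d) ~ 1/d^2.  In log space the prior
    density is exp(-s), so completing the square shifts the posterior
    mean down by sigma^2: median = signal * exp(-sigma^2)."""
    return signal * math.exp(-sigma ** 2)

direct = posterior_median(WORLD_POP, SIGMA_NATURE)
through_chain = posterior_median(WORLD_POP, chain_sigma(SIGMA_NATURE, links=7))

print(f"nature's signal alone -> median ~ {direct:.3g} deaths")
print(f"after 7 noisy links   -> median ~ {through_chain:.3g} deaths")
```

With these made-up parameters the direct signal still supports a median in the tens of millions of deaths, while the same signal heard through seven links supports a median of only a handful; the exact numbers depend on the prior chosen, but the many-orders-of-magnitude collapse does not.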
This suggests we reduce the length of our reporting chains, such as in efforts to make organizations more innovative by reducing layers of management:
Perhaps we need only give the first few levels of reporting the discretion to adapt a claim to context; reports at further levels could be something like “Fred said that our best estimate of disaster deaths is one million.” Another alternative might be to use prediction markets to shorten reporting chains: the person who saw nature’s signal trades in the market, anyone who wants to correct that market price for context does so via further trades, and everyone else is referred to the market price.
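How much shortening the chain buys can be seen by sweeping the chain length in a toy version of the post's example. Again the prior p(d) ∝ 1/d², the world-population-sized signal, and the posterior-median formula signal × exp(−σ²) are my assumptions, not the paper's exact model:

```python
import math

SIGMA0 = math.log(10)  # nature's signal: lognormal sd of a factor of ten
SIGNAL = 6.5e9         # signal says everyone dies (rough world population, an assumption)

# Under an assumed power-law prior p(d) ~ 1/d^2, the posterior median is
# signal * exp(-sigma^2); each extra 10%-noisier link in the chain
# therefore shrinks the credible estimate multiplicatively.
medians = []
for links in range(8):
    sigma = SIGMA0 * 1.10 ** links
    median = SIGNAL * math.exp(-sigma ** 2)
    medians.append(median)
    print(f"{links} links: median estimate ~ {median:.3g} deaths")
```

Every link removed recovers orders of magnitude of the warning, which is the quantitative case for flatter organizations or for routing the original observer straight to a market.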
This discussion resonates with Eliezer's discussion, in the next post, of the "decay" of knowledge as it is transmitted from generation to generation. In that case we are more or less forced into a linear chain, with no way (in the case of, e.g., God speaking to Moses) of going back and rechecking the phenomenon. So the decay is structurally inevitable.
Science, of course, promotes short chains by encouraging replication and black-boxing. Nobody has to "trust" that PCR works; they get PCR materials and equipment and depend on them every day in their lab. All of molecular biology is a distributed daily replication of PCR. And this pattern holds for essentially all major scientific results, though less dramatically.
At the risk of getting boring, this is another piece of (what should be) a general theory of judgment aggregation. The community around this blog seems to be stepping smartly in that direction.
Dagon, no doubt it is easier for networks to communicate info that can be cheaply and repeatedly generated at will by many dispersed people. Unfortunately the most interesting extraordinary claims are rarely of this sort.