There's a well known bias (the causes are controversial) that makes it hard to make prices move below 5% or above 95%

"Well known" to you, maybe. Thanks for the information. Any advice on where to go for more "tips for Prediction Market newbies"?

Rolf Nelson wrote: my uninformed hypothesis is that the Foresight Exchange prices are currently "sticky", that people place[d] buy orders in 2005, and then abandoned their accounts, rather than reduce their buy orders every year [...]; and that there are not enough currently-active traders able and willing to compensate for this. As an example, USAGeo still had Buy orders at 1, even days after the Dec 31 2007 deadline had passed and the event had not happened.

The facts are inconsistent with your suggestion.

There's a well known bias (the causes are controversial) that makes it hard to make prices move below 5% or above 95%. But there are plenty of cases where the prices move slowly over time, reflecting events that don't take place. Draft and NucW are two picked arbitrarily. Even USAGeo shows a clear trend over time, even if it didn't stabilize at 0. The last 20 trades (covering the last three months of the claim) were all at 1. (Why would anyone offer it for less?)

Well, the meaning of "30%" is probably "halfway between the outer limit of statistical insignificance (which would be 10%) and 50/50." So, not really likely (more than 50%), but not so unlikely as to be insignificant and hence behaviorally ignorable.

Of course, whether that really is "the probability," if that even has any meaning, is quite another matter.

Nominull, correct in everything you say. My point was about people's (and the media's) reaction to the publication of a nice scientific number like 30%. Granted, somewhere in my mind is a probability (though not necessarily a percentage) for, well, everything. However, I wouldn't want to put a percentage on each one of these. What if I did so, and this percentage stuck, and began to bias my thinking? My point, like the first paragraph of the original post, referred to our automatic acceptance of numbers like this.

Briar, I'm afraid you didn't understand my point. I'm not saying that estimating 30% is simply a way of saying that the odds are somewhere between 1% and 99%.

If we consider factual questions such as "Is the extraterrestrial hypothesis the correct explanation for UFOs?", the calculated probability, if there was one, would certainly be an extreme, which we presume would be extremely low. (Consequently the estimated probability should also be low, but not as low as we suppose a calculated probability would be, to take into account the possibility of serious miscalculation, such that the true extreme was on the other side.)

But if we consider a question like "Will this coin end up heads or tails after I flip it?", we cannot get an extreme, because the thing can easily go either way.

Now when we are not calculating a probability but estimating a probability, we are in reality considering a question like this: "Will my expectations be right or wrong?" And this is a question much like the question about the coin, because I can easily be mistaken. So the estimated probability should not ordinarily be an extreme.

As for the 30%, it signifies that in situations where I have this degree of expectation, I claim that the thing I expect usually happens about 3 out of 10 times. So in determining whether people who claim a 30% chance of a nuclear detonation in the next 10 years are right or not, we have to consider not only whether it actually happens, but also whether other things happen that they say have a 30% probability.

Even so, the difference between 30%, 90%, and 10% is clear. If I say the odds are 90%, I claim that in 9 out of 10 cases where I have this degree of expectation, the thing happens, and if I say the odds are 10%, I claim that the thing happens once in ten times.
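A rough sketch of that calibration claim (my own illustration with made-up forecasts, not anything from this thread): group past forecasts by the stated probability and check how often each group's events actually happened.

```python
from collections import defaultdict

# Hypothetical (stated probability, outcome) pairs; outcome 1 means the thing happened.
forecasts = [(0.3, 0), (0.3, 1), (0.3, 0), (0.3, 0),
             (0.9, 1), (0.9, 1), (0.9, 0),
             (0.1, 0), (0.1, 0), (0.1, 1)]

buckets = defaultdict(list)
for stated, outcome in forecasts:
    buckets[stated].append(outcome)

# A well-calibrated forecaster's 30% claims should come true roughly 3 times in 10.
for stated, outcomes in sorted(buckets.items()):
    observed = sum(outcomes) / len(outcomes)
    print(f"stated {stated:.0%}: happened {observed:.0%} of the time ({len(outcomes)} forecasts)")
```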

Ben Jones, you sound like a frequentist. Any probability we place on the U.S. getting nuked in the next 10 years other than zero or one is guaranteed to be "incorrect", inasmuch as the term applies: either it will happen or it won't. When we say there's an X% chance of something, we mean, well, it's a little tricky to explain (I'm pretty sure Eliezer Yudkowsky wrote something about it, but I don't know where), but basically we mean, in a very specific sense, that we are X% sure that thing will happen. And, you know, you have to be *some* percent sure it will happen. Even if your guess is underinformed, you still have a guess.

The 30 percent estimate appears not only in the Lugar survey but also in Matthew Bunn, A Mathematical Model of the Risk of Nuclear Terrorism, Annals of the American Academy of Political and Social Science, 2006, 607, 103. Accurate attribution and successful retribution are unlikely if the weapon, as is usually supposed, were produced from stolen uranium. It does not require much technical sophistication for a group to produce a ~10 kt device, if they have sufficient resources to obtain the uranium. (Bunn and Wier, Terrorist Nuclear Weapon Construction: How Difficult?, Annals of the American Academy of Political and Social Science, 2006, 607, 133; Zimmermann and Lewis, Bomb in the Backyard, Foreign Policy, Nov 2006.) I agree with Robin that real money prediction markets would be useful to assess the probabilities of various catastrophes, including nuclear terrorism.
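For readers curious what such a model looks like in shape, here is a purely illustrative decomposition of a decade-level risk into per-year component judgments. The factor names and numbers below are my own placeholders, not Bunn's parameters.

```python
# Illustrative only: factor names and values are assumptions, not Bunn's actual parameters.
p_attempt_per_year = 0.5      # some group attempts to acquire a weapon in a given year
p_acquire_material = 0.1      # an attempt yields enough fissile material
p_build_and_deliver = 0.3     # the material becomes a delivered, working device

p_success_per_year = p_attempt_per_year * p_acquire_material * p_build_and_deliver
years = 10
p_at_least_one = 1 - (1 - p_success_per_year) ** years

print(f"Per-year probability: {p_success_per_year:.3f}")
print(f"Probability of at least one success in {years} years: {p_at_least_one:.2f}")
```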

briarandbramble beat me to it. I should probably read ALL of the comments before making my own next time.

Just a quick question since I'm not an expert at this sort of thing at all... how does one arrive at an estimate of the probability of a nuclear attack occurring? Other than just making it up off the top of your head, I cannot conceive of a way to get a real number representing the chance it will happen. 30% sounds high to me, but so does every number above "I don't have a damn clue". It all just seems like stabs in the dark.

Unknown, it sounds like then I should interpret "30%" as polite shorthand for "some unspecified value between 1% and 99%". That's really the best an estimated probability can do, I guess. In this context, I think 1% and 99% are basically the same value, and both mean, "yes, this might happen". I wish I understood what an estimated probability was mathematically.

However low the threat of a nuclear attack on the US, it might well be a more serious threat on balance (in terms of expected human and economic cost) than, say, the threat of terrorists putting bombs on the New York subway. The latter is enormously more likely, but it would take a lot of subway bombs to be on a par with finding a 'missing' Israeli nuke and activating it in the middle of Baltimore.

Of course, if you're thinking only in terms of costs to people and property, terrorism ought to have much less salience than it does. I wouldn't be surprised if the expected number of people to die in terrorist attacks in the US over the next 10 years, say, is less than the number we expect to get shot dead by police, and both these numbers will be far less than the number killed by career criminals with economic motives. Even that number will be dwarfed by the number who die by their own hand or by accidents largely of their own making, and much larger still is the number who die due to poor nutrition and/or inadequate access to healthcare. If the government diverted resources away from anti-terrorism and towards these other problems, how many lives could they save and at what price?

But maybe there's a disutility to terrorist attacks that somehow trumps the mere body count. Two components here: psychology and politics. First, terrorist attacks may inflict more psychological damage on survivors and/or relatives of those killed than the usual criminal violence, because it's harder to understand why it happened. Also, the constant news coverage may exacerbate the trauma, and even upset people with no direct link to the attack. Second, when it comes to people killed in robberies and so on, people have low expectations of what the government can accomplish, so it doesn't look too bad for the government if the level of this fails to decrease as much as it could. However, if a terrorist attack is perceived as preventable, then it will inflict a major loss of prestige on the government if it occurs, because it has such prominence as a one-off event. This applies to some extent to voters (though as terrorist attacks also trigger a 'patriotic sheep' response, this may not actually be bad for the politicians), but more importantly, it would diminish the US's reputation abroad: the US's massive political influence is largely a function of how 'strong' the country is perceived to be, so any sign of 'weakness' is bad for the US. The stock market likewise responds to such signs of 'weakness' on the part of government, perhaps excessively so.

So it could be rational, from a cynical perspective, for governments to pour large amounts of resources into fighting terrorism at the expense of, say, ordinary policing. However, I suspect that if you think like this, it's preferable to *pretend* to invest a lot in prevention of terrorism, while the money actually goes on pork. This is because what counts most is giving the impression of caring about terrorism in the media, and as long as nothing too serious happens you'll get away with it completely, whereas special interest votes generally have to be bought with real money. (Including indirectly: if you use tariffs and regulations to create a playing field that's biased in favour of the special interests, it's effectively giving them money *and* destroying some of the economic prosperity of the country, but it's the concentrated benefit to the special interests that matters politically.)

Incidentally: dirty bombs are ideally suited to terrorism (aka psychological warfare), not because they're an effective way of killing people, but because of people's distorted perception of probability. Let's say a huge dirty bomb goes off in the middle of Paris, contaminating an area with a population of 2m. Suppose that the intensity is such that this causes an additional 100 cancers over the next 30 years, of which 20 are fatal. That's not so bad, you might say. But the point is that the attack has apparently increased the chance of getting cancer for 2m people, however marginally, and that will make them unhappy and afraid. Tourists won't want to go to Paris, because although the cancer risk of staying in Paris for a week would probably be much less than the cancer risk of the flights there and back, Paris would have a big 'CONTAMINATED!' sticker on it in people's minds, unlike planes. Locals will be reluctant to work in central Paris, for the same reasons, and residents will pressure the government to help them pay for accommodation elsewhere as the value of their property is wiped out. Compared to the actual death toll, the indirect economic cost would be colossal.
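To put the Paris example in numbers, here is a small sketch using only the hypothetical figures from the comment above; the point about the indirect economic cost is the commenter's, not something the arithmetic proves.

```python
# Hypothetical figures from the example above (not real data).
exposed_population = 2_000_000
additional_cancers = 100              # extra cancers over the next 30 years
additional_fatal_cancers = 20

# Per-person increase in risk: tiny in absolute terms.
added_cancer_risk = additional_cancers / exposed_population          # 5e-05
added_fatality_risk = additional_fatal_cancers / exposed_population  # 1e-05

print(f"Added cancer risk per exposed resident:   {added_cancer_risk:.4%}")
print(f"Added fatality risk per exposed resident: {added_fatality_risk:.4%}")
# The perceived 'CONTAMINATED!' label, not these tiny per-person numbers,
# is what drives the large indirect economic cost.
```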

Briar, there is a difference between calculated probabilities and estimated probabilities. For the reasons given by Eliezer in "Einstein's Arrogance," calculated probabilities are usually very high or very low, as you suggest (although this doesn't prevent the probability of a coin flip landing heads from being 50%). But estimated probabilities should very rarely be something like 0.0001 or 0.9999, and are much more reasonably things like 5%, 25%, 60%, and so on. This is necessary in order to calibrate human estimation; if one says 0.001 whenever one believes that, if the probability could be calculated, it would be 1 in 1000, one will turn out to be wrong often, often enough that one's odds would be much closer if one said 10% rather than 1 in 1000.

The odds of a nuclear weapon being used cannot be calculated but only estimated, and thus the estimate must be somewhere in the middle, or it will be grossly overconfident or underconfident.
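One way to see why the overconfident estimate does worse (my own illustration, not part of the comment): under a log score, stating 0.1% and occasionally being wrong costs far more, in expectation, than the modest 10% estimate.

```python
import math

def log_loss(stated_probability, happened):
    """Negative log-likelihood of the outcome under the stated probability (lower is better)."""
    return -math.log(stated_probability if happened else 1 - stated_probability)

# Suppose events of this kind actually happen about 1 time in 10.
true_rate = 0.10
for stated in (0.001, 0.10, 0.30):
    expected_loss = (true_rate * log_loss(stated, True)
                     + (1 - true_rate) * log_loss(stated, False))
    print(f"stated {stated:>5.1%}: expected log loss {expected_loss:.3f}")
# The 10% estimate scores best here; 0.1% is heavily penalized by the
# occasional event that does happen.
```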

I wonder if this survey is the source of the figure: http://www.spacedaily.com/n...

Dirty bombs are much easier and thus more likely. Most terrorists are not the sharpest blades in the drawer. Give them an easy job, fire up their religious or political zeal, and they might get the job done.

More complex jobs generally require the help of professionals--who often have ties to or receive help from established governments or quasi-governmental organisations. Attribution of blame for a nuclear attack will likely be made to stick to persons connected with an existing government--even if putatively carried out by non-government actors.

Catastrophe prediction is almost impossible to get right. Peak oil may happen anytime between now and 2040. The climate may either cool or warm between now and 2100. There may be an all-out nuclear war on the planet in the next 20 years. Or aliens may bombard the planet with a swarm of comets in the first step of a massive terraforming operation.

I am always skeptical of estimated probabilities that are between 10% and 90%. In general, it is much more common for the evidence to yield a probability of something like 0.00001 or 0.999. To me it always seems like quite a coincidence if all the evidence in favor and against just happens to balance out so evenly that you end up with a number like 30%. So, when that happens, my confidence in the estimate drops, because there's such a strong bias towards pretty numbers.

I agree these percentages seem way too high. However, the real issue is that there isn't a place that aggregates all the available predictions and probabilities and is understandable to the mass market.

I know that Bo Cargill once mentioned that Google had considered putting prediction market results in search results. That could be an amazing resource. I guess the main problem is that many prediction markets lack the liquidity to give accurate predictions (like the Ideosphere example).
