Monthly Archives: December 2006

Normative Bayesianism and Disagreement

Normative Bayesianism says that you ought to believe as you would if you were an ideal Bayesian believer; believing that way is what it is to believe rationally. An ideal Bayesian believer (1) has beliefs by having credences, where a credence is a degree of belief in a proposition; (2) has a Prior, a complete consistent set of credences (capitalized to avoid confusing priors = a person’s credences with Priors = a plurality of complete consistent sets of credences), which is to say a credence function from a sigma algebra of propositions into the reals that is a probability measure; and (3) changes his beliefs in the light of the evidence he acquires by updating his credence function using Bayes’ theorem.
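For concreteness, here is a minimal sketch in Python of clause (3), updating a credence by Bayes’ theorem; the hypothesis and the numbers are illustrative only, not taken from the post.

    # Minimal sketch of Bayesian updating (clause 3 above).
    # Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E).
    # The hypothesis H and the numbers below are illustrative only.

    def update(prior, likelihood_if_true, likelihood_if_false):
        """Return the posterior credence in H after observing evidence E."""
        evidence = likelihood_if_true * prior + likelihood_if_false * (1 - prior)
        return likelihood_if_true * prior / evidence

    # A believer with credence 0.2 in H who thinks E is four times as likely
    # if H is true (0.8) as if it is false (0.2) should move to credence 0.5:
    print(update(prior=0.2, likelihood_if_true=0.8, likelihood_if_false=0.2))  # 0.5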

Much of the earlier discussion about the rationality of disagreement and the requirement of modesty was advanced on the basis of the claim that Bayesian believers cannot rationally disagree. But there are different versions of what precisely that claim might be.

Strong Bayesian Agreement: Ideal Bayesian believers who have common knowledge of each other's opinion of a proposition agree on that proposition.

Moderate Bayesian Agreement: Ideal Bayesian believers who have rational Priors and common knowledge of each other's opinion of a proposition agree on that proposition.

Weak Bayesian Agreement: Ideal Bayesian believers who have a common Prior and common knowledge of each other's opinion of a proposition agree on that proposition.

Continue reading "Normative Bayesianism and Disagreement" »


Vulcan Logic

Sometimes reading the posts and discussions here I am reminded of some of the most interesting characters in science fiction: the Vulcans of Star Trek. Vulcans are depicted as having two main traits: they are extremely logical, and they are unemotional. These two characteristics are generally presented as if they are related, or even synonymous. Vulcans make decisions based on logic, in contrast to humans who frequently make decisions on emotional grounds.

When we try to overcome our biases and see the truth clearly, are we aiming to become Vulcans? Is Bayesian another word for Vulcan?

In some ways it does seem true. We talk about trying not to be influenced in our thinking by our hopes and fears, but to reason dispassionately and logically. Many or most of our biases are emotionally based and satisfy emotional needs. Vulcans have perfected the art of overcoming these sorts of biases. In many of our critiques of bias here, I mentally hear the voice of Mr. Spock chiding: "You are behaving most illogically."

One problem with the Vulcan emphasis on logic above all is that it is not clear what motivates Vulcans. Logic helps us to see what is true, but it cannot tell us what we ought to do. Indeed, although Vulcans in the stories are successful within the quasi-military structure of Star Fleet, where orders come from above and give them straightforward guidance as to what their goals should be, they seem to be at something of a loss if thrown into an ambiguous situation, separated from authority and forced to set their own goals.

This suggests that we do not want to become true Vulcans, but rather to retain a core of human emotionality surrounded by a shell of logic. Our emotions, our needs and our drives set our goals. Logic then helps us to achieve those goals. Logic is the means, but emotional satisfaction is the end.

I must admit that this sounds a little too pat. It is far from clear that we can separate our mental functions so nicely. Even Vulcans are depicted as suffering from rare episodes of near psychotic irrationality and emotion, as years of suppressed emotions seem to explode uncontrollably. I wonder if our efforts to channel and control our emotions may lead to similar catastrophic failures.


Benefit of Doubt = Bias

One dictionary defines "to give the benefit of the doubt" as

To believe something good about someone, rather than something bad, when you have the possibility of doing either.

That is, assume the best.  This may be better than assuming the worst, but honesty requires you to instead remain uncertain, assigning chances according to your evidence.   Does this mean we should stereotype people?  After all, M Lafferty commented

To make assumptions about an individual based on a stereotype is wrong, even if the stereotypical view is broadly accurate.

To the contrary, I say honesty demands we stereotype people, instead of giving them "the benefit of the doubt."  Bryan Caplan has emphasized to me that most stereotypes are on average accurate:

Obviously, every stereotype has exceptions; stereotypes are useful because they are better than nothing, not because they are infallible.

Continue reading "Benefit of Doubt = Bias" »


See, But Don’t Believe

Friday’s Science reported that one in four published journal articles has misleadingly manipulated images.

Some biologists become so excited by a weak signal suggesting the presence of a particular molecule that "they’ll take a picture of it, they’ll boost the contrast, and they’ll make it look positive" … scientific journals, concerned about a growing number of cases of image manipulation, are cracking down on such practices with varying degrees of aggressiveness.  At one end of the spectrum is the biweekly Journal of Cell Biology, which for the past 4 years has scrutinized images in every paper accepted for publication — and reports that a staggering 25% contain at least one image that violated the journal’s guidelines.   That number has held steady over time …

Most journals are reluctant to devote much staff time and money to hunting for images that have been inappropriately modified.  Vanishingly few are emulating the Journal of Cell Biology. … and its two sister journals, which have a dedicated staffer who reviews the roughly 800 papers accepted by all three each year.   Science's screening is principally designed to pick up selective changes in contrast and images that are cut and pasted. … Since initiating image analysis earlier this year, Science has seen "some number less than 10," or a few percent at most.  … the difference might be due to … the fact that [the Journal of Cell Biology's] staffer … is now unusually experienced at hunting for modifications.

The cost-effectiveness of this one staffer in disciplining an entire field of research seems enormous.  We could clearly increase research progress overall by replacing a few more researchers with such staffers.  The fact that no other journals do anything close suggests either that we have a serious coordination failure, or that research progress is not a high priority.


The Future of Oil Prices 2: Option Probabilities


A few days ago I showed a plot of oil futures prices, and Robin made the point that it would be useful to see information about variance as well. For each futures contract there is a set of options that can be used to deduce how much uncertainty the market sees in future prices. For example, the price yesterday for the December 2010 oil contract was $67 per barrel. But for $2 you could buy a call option with a strike price of $100. This will have a value of P – $100 if at the end of 2010 the oil price P is higher than $100 (and will have no value if the oil price is lower than $100). Therefore if oil goes up to $105 you will more than double your money; if it hits $122 you will make ten times your investment. Clearly the market does not view these possibilities as very likely, or the option would not be as cheap as $2. Below the fold I will describe and illustrate a simple technique for deriving probability estimates for price targets merely by glancing at tables of option values. I find it useful and practical for getting a quick overview of how likely the market judges various possibilities to be.
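The post's own technique is below the fold; as a rough illustration of the general idea, here is a minimal sketch of one standard shortcut (not necessarily the one described there): ignoring discounting, the risk-neutral chance that the final price finishes above a strike is roughly the negative slope of the call-price curve at that strike, which can be estimated from two nearby strikes in an option table. The strikes and premiums below are made up for illustration, not actual 2006 quotes.

    # A minimal sketch, not necessarily the technique described below the fold.
    # Ignoring discounting, the risk-neutral probability that the final price
    # ends above a strike K is roughly the negative slope of the call-price
    # curve at K, estimated here by a finite difference between two strikes.
    # The strikes and premiums are illustrative, not actual 2006 quotes.

    def prob_above(strike_lo, call_lo, strike_hi, call_hi):
        """Estimate P(final price > midpoint of the two strikes)."""
        return (call_lo - call_hi) / (strike_hi - strike_lo)

    # If a $95 call trades at $3.10 and a $105 call at $1.30, the implied
    # chance of oil finishing above roughly $100 is about 18%:
    print(prob_above(95, 3.10, 105, 1.30))  # ~0.18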

Continue reading "The Future of Oil Prices 2: Option Probabilities" »

GD Star Rating
loading...
Tagged as:

Resolving Your Hypocrisy

Self-love is more cunning than the most cunning man in the world.  … Hypocrisy is the homage vice pays to virtue.    La Rochefoucauld.

Humans are hypocrites.  That is, we present ourselves and our groups as pursuing high motives, when more often low motives better explain our behavior.   We say we invade nations to help them build democracy, rather than for revenge or security.  We say we marry to help our partner, rather than to gain sex or security.  We say we choose our profession to help others, and not for prestige or income.  And so on.

Comedians live by ridiculing such hypocrisy, but "cynics" who complain without such wit and style are despised.  In contrast, we are attracted to the innocent who naively believe our hypocrisies.

Noticing the hypocrisy in others usually makes us feel morally superior.  After all, we know we are not hypocrites; "I can look inside myself and see my sincerity."  But eventually experience and intelligence force some of us to face the likelihood that we are no different.   At this point we can resolve our hypocrisy in two ways: we can start really living up to our high ideals, or we can admit we don't care as much as we thought about those ideals.

Continue reading "Resolving Your Hypocrisy" »


Academic Overconfidence

Another important bias in academic consensus is overconfidence.  Even in the hardest of hard science, Henrion and Fischhoff showed in 1986 (ungated for now here) that published error estimates for fundamental constants of physics were seriously overconfident.   Looking at 306 estimates for particle properties, 7% were outside of a 98% confidence interval (where only 2% should be).   In seven other cases, each with 14 to 40 estimates, the fraction outside the 98% confidence interval ranged from 7% to 57%, with a median of 14%. 
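As a rough check on how far off that calibration is, here is a small sketch; only the 306 estimates and the 7% miss rate come from the study as quoted above, the binomial arithmetic is my own illustration.

    # Rough calibration check (a sketch; only the 306 and 7% figures come
    # from the study cited above, the binomial arithmetic is illustrative).
    # Honest 98% intervals should each miss with probability 0.02, so about
    # 6 of 306 estimates should fall outside; 7% of 306 is about 21.
    from math import comb

    def prob_at_least(k, n, p):
        """P(X >= k) for X ~ Binomial(n, p)."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    print(0.02 * 306)                    # expected misses if well calibrated: ~6
    print(prob_at_least(21, 306, 0.02))  # chance of seeing 21 or more misses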

Last week’s New Scientist described a dramatic example with policy implications (ungated for now here):

In July 1971, Stephen Schneider, a young American climate researcher at NASA’s Goddard Space Flight Center in New York, made headlines in The New York Times when he warned of a coming cooling that could "trigger an ice age". … The US National Academy of Sciences reported "a finite probability that a serious worldwide cooling could befall the Earth within the next 100 years". … It is often claimed today that the fad for cooling was a brief interlude propagated by a few renegade researchers or even that the story is a myth invented by today’s climate sceptics. It wasn’t. There was good science behind the fears of global cooling.  … 

All this raises an alarming question. If climatologists were so wrong then, why should we believe them now? As those who played a part in the cooling scare now readily admit, those early studies were based on flimsy data collected by very few, often young, researchers. In 1971, when Schneider’s paper appeared, he was instantly regarded as a world expert. It was his first publication.  Today, vastly more research has been done into how and why climate changes. The consensus on warming is much bigger, much broader, much more sophisticated in its science and much longer-lasting than the spasm of concern about cooling.

This is too pat an answer.  Yes, we have more data now, but the issue is our tendency to claim more than our data can support.  I’m not saying global warming is wrong, just that we should be less confident than academics suggest.   


Gnosis

In honor of Christmas, a religious question.

Richard and Jerry are Bayesians with common priors. Richard is an atheist. Jerry was an atheist, but then he had an experience which he believes gives him certain knowledge of the following proposition (LGE): “There is a God, and he loves me.” Jerry experiences his knowledge as gnosis: a direct experience of divine grace that bestowed certain knowledge, period, not conditioned on anything else at all. (Some flavors of Christianity and many other religions claim experiences like this, including prominently at least some forms of Buddhism.) In addition to bestowing what Jerry takes to be certain knowledge of LGE, the gnosis greatly modifies his probability estimates of almost every proposition in his life. For example, before the gnosis, the Christian Bible didn’t significantly affect his subjective probabilities for the propositions it is concerned with. Now it counts very heavily.
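One way to make Jerry’s predicament precise (my gloss, using nothing beyond Bayes’ theorem itself): once a credence reaches 1, conditioning can never move it, whatever evidence E later arrives, provided E itself has positive probability:

\[
P(\mathrm{LGE} \mid E)
  = \frac{P(E \mid \mathrm{LGE})\,P(\mathrm{LGE})}
         {P(E \mid \mathrm{LGE})\,P(\mathrm{LGE}) + P(E \mid \neg\mathrm{LGE})\,P(\neg\mathrm{LGE})}
  = \frac{P(E \mid \mathrm{LGE})}{P(E \mid \mathrm{LGE})} = 1,
  \qquad \text{since } P(\neg\mathrm{LGE}) = 0.
\]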

Richard and Jerry are aware of a disagreement as to the probability of LGE, and also the truth of the various things in the Bible. They sit down to work it out.

It seems like the first step for Richard and Jerry is to merge their data. Otherwise, Jerry has to violate one rule of rationality or another: since his gnosis is only consistent with the certainty of LGE, he can either discard plainly relevant data (irrational) or fail to reach agreement (irrational). Richard does his best to replicate the actions that got the gnosis into Jerry’s head: he fasts, he meditates on the koans, he gives money to the televangelist. But no matter what he does, Richard can not get the experience that Jerry had. He can get Jerry’s description of the experience, but Jerry insists that the description falls woefully short of the reality — it misses a qualitative aspect, the feeling of being “touched,” the bestowal of certain knowledge of the existence of a loving God.

Is it in principle possible for Richard and Jerry to reach agreement on their disputed probabilities given a non-transmissible experience suggesting to Jerry that P(LGE)=1?


Ads that Hurt

In some earlier posts I talked about the idea that advertisements can be privately profitable for firms but still be socially harmful due to uninternalized negative effects on their targets.  In the comments, Glen Raphael asked me for an example.  Here are a few of my favorites.

1. JIF peanut butter.  The slogan "choosy moms choose JIF" is famous in marketing circles for having been extremely effective.  The reason, of course, is that the implied corollary to the slogan is "crappy negligent moms who don’t care about their kids give other brands of peanut butter."  The harm here is that it gets moms into the habit of thinking that the place to concentrate their efforts to be better moms involves choice of peanut butter, instead of things that might actually work.

2. Disney.  One could write a whole book about how horrifying Disney is (I think maybe someone even has).  But my specific example is a commercial for Disneyland where the camera is in real tight on the face of an ecstatic, awestruck-looking kid.  After a few seconds, the parents chuckle knowingly and say something like "Timmy, this is just the entrance, let’s go inside the park."  The idea is that Disneyland isn’t just a fun place to eat junk food and go on roller coasters and stuff, it’s a source of wonder, the very essence of childhood.  The harm here is obvious.

3. Any one of a million beer commercials.  Beer commercials overwhelmingly involve attractive women.  This is either because the advertisers want you to believe that drinking their beer will actually improve your chances with attractive women, or at least they want you to associate drinking their beer with being as attractive to them as you wish you were.  I think it’s pretty clear that encouraging young men to think of women as ornaments that go nicely with getting loaded is not a recipe for subsequent success in matters of the heart, not for the men and certainly not for the women.

I could go on.


Gifts Hurt

Two weeks ago Alex Tabarrok at Marginal Revolution called me a Scrooge for pointing out that "helping" professions don’t help more.  So this Christmas day, let me Scrooge again by pointing out the dark side of gifts.   It is not just that gifts can be worth less than they cost; the problem goes deeper.  In Friday’s Washington Post, Charles Krauthammer explained:

The roundsman is the guy who, with the class huddled at the bed of a patient who has developed a rash after taking penicillin, raises his hand to ask … whether this might not instead be a case of Schmendrick’s Syndrome … The point is for the prof to remember this hyper-motivated stiff who stays up nights reading journals … the roundsman, let’s call him Oswald, ignores at his peril, is that this apple polishing does not endear him to his colleagues, … The general feeling among the rest of us is that we should have Oswald killed. … There’s always an Oswald.  There’s always the husband who takes his wife to Paris for Valentine’s Day. Valentine’s Day? The rest of us schlubs can barely remember to come home with a single long-stem rose. What does he think he’s doing? And love is no defense. We don’t care how much you love her — you don’t do Paris. It’s bad for the team.

Gift-giving is in part a contest, to show how much more you know and care about someone, relative to others.  And what that someone gets, in part, is having everyone see how loved they are, relative to others.  If you succeed and make yourself look good, you make other givers look worse by comparison.  And if your recipient looks loved, other recipients look less loved by comparison. 

"All is fair in love and war" they say, and this sort of love is a lot like war; when you gain, others lose.  But while we usually feel at least a little bad about the harm we cause in ordinary war, we are smugly proud of the harm we cause in this war of love that is gifts.   

The world may gain some benefits from people feeling they can trust their associates.   But even so, I’d guess most gifts produce a net harm.   

Enjoy your spoils of war this Christmas day.  :)
