Open Thread

Here is our monthly place to discuss Overcoming Bias topics that have not appeared in recent posts.

  • Unknown

    Given all the talk about voting, why don’t we hold a vote on who has overcome the most bias?

    Among the commenters, I propose Caledonian as the one who is the most biased. I am presently undecided as to who is the least biased, but I may vote later.

    Among the posters, there seem to be only two major candidates (although it looks as though this may change.) Consequently I vote for Robin Hanson as the one who has overcome the most bias – although I am still looking forward to the disagreement case study involving him and Eliezer.

  • Cyan

    There’s been frequent reference to Eliezer’s having experienced a dramatic change of point-of-view sometime in 2003. I’d be interested in reading more about that, or following links to same if it’s already been posted somewhere.

  • Silas

    Repost of my comment, now that there’s a post for which it’s appropriate:

    Robin_Hanson’s proposed bet about the repeated study on women’s vs. men’s talkativeness expired yesterday, and it looks like the results confirmed the previous study. Comment?

    I can’t seem to link directly to the contract, as there’s some javascript predicate, but if you click soon you’ll get it here.

    (FWIW, I wanted better experimental protocols instead of a simple repeat, so I didn’t bet.)

  • LG

    Cyan, Eli has written about that on his personal site — it was his “Bayesian Enlightenment.”

  • Overcoming Cryonics

    How, despite EY’s impressive exploration of epistemology and biases, he so overconfidently and dogmatically believes in the cryonicism / life-extensionism / trans-humanism / singularity religion.

  • Cyan

    LG, that doesn’t seem to have a narrative of Eliezer’s Bayesian Enlightenment, which is really what I’m after. Eliezer has said he’ll get there eventually (in the comment thread to Reversed Stupidity). Eliezer’s enlightenment was more than just an understanding of Bayes’ Theorem — in the thread I linked above, Tom McCabe notes that Eliezer was posting about that in 2001. So what was it exactly?

  • Ben Jones

    “…cryonicism / life-extensionism / trans-humanism / singularity religion.”

    I’m not a fanatic of any of these, OC – far from it – but I object to the lumping of all four together and the plonking of ‘religion’ at the end. This is just my opinion, but I think that says a good deal more about you than a response would about Eliezer. Whereas every religion is, by definition, prescriptive, the Singularity seems to me fairly open-ended and even inclusive. Religions tend to glorify the ancient and the mystical – the Singularity refers to the future and cooler, cleverer (but not mystical) versions of what we have today. I wouldn’t call it science, but at least it makes some predictions. What religious predictions have come to pass? Leaving the other three ‘religions’ aside for the moment, how can you convince me that Singularitarians belong to a religion? Please don’t confuse faith and belief in your response; I believe the sun will rise tomorrow, but not based on faith.

  • Nick Tarleton

    OC, can you offer any actual arguments that EY is overconfident, or that the four things you mention are religions?

  • Tiiba

    “How, despite EY’s impressive exploration of epistemology and biases, he so overconfidently and dogmatically believes in the cryonicism / life-extensionism / trans-humanism / singularity religion.”

    Can you say something at least slightly more like an argument? What do you disagree with? Why?

    Or are you flame-baiting?

  • Doug S.

    The Singularity occurred in 1876, when Thomas Edison invented the industrial research laboratory. We’ve been riding the asymptote ever since. 😉

  • Stagyar zil Doggo

    I could not find any info on the Intrade site as to which study they used to close the Adult Talkativeness contract. Do you have a reference or link?


  • Caledonian

    OC, can you offer any actual arguments that EY is overconfident,

    You’re joking, right? We’re talking about the man who bragged that his grandparents would outlive the Milky Way.

    He’s either grossly overconfident, or prone to grossly-exaggerated rhetoric.

  • Unknown

    Regarding the particular statement about his grandparents, Eliezer seems to have retracted it. It would certainly show that he was overconfident at the time.

    However, many of his suggestions regarding probabilities even now strongly suggest overconfidence. For example, he suggested a probability of over 50% for the success of cryonics, while Robin Hanson says he expects a better than 5% chance, which suggests he is thinking of something like 10%. This disagreement shows that either Robin is overly skeptical or Eliezer is overconfident. Given these two possibilities, if no one comes up with evidence for one or the other, we should assign a probability of at least 50% to the position that Eliezer is overconfident. And since his overconfidence in the past is evidence (makes it more probable) that he is overconfident now, and since past and present overconfidence are highly correlated, there is a very high chance that Eliezer is overconfident.
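    Unknown's argument can be sketched as a simple Bayesian update. The 50% prior comes from the comment itself; the likelihood ratio of 3 below is an illustrative assumption, not a figure anyone in the thread has proposed:

    ```python
    # Sketch: start from a 50% prior that Eliezer is overconfident
    # (symmetric ignorance between "Robin too skeptical" and "Eliezer
    # overconfident"), then update on the evidence of past overconfidence.

    def posterior(prior: float, likelihood_ratio: float) -> float:
        """Bayes' rule in odds form: posterior odds = prior odds * LR."""
        prior_odds = prior / (1 - prior)
        post_odds = prior_odds * likelihood_ratio
        return post_odds / (1 + post_odds)

    prior = 0.5  # at least 50%, per the disagreement argument above
    lr = 3.0     # assumed: the past evidence is 3x likelier if he is
                 # overconfident now than if he is not
    print(posterior(prior, lr))  # 0.75
    ```

    Any likelihood ratio above 1 pushes the posterior above the 50% prior, which is all the qualitative conclusion ("a very high chance") requires.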

  • burger flipper

    EY is also under 30, prone to claiming a Vulcan thought process (sorry, “I wonder how many people are trapped up there” was in everyone’s top 3), and capable of being moved to poetry (and then moved to dissecting his own poems). Under 30 explains the others to my satisfaction. His stuff’s always a good read, and more power to him.
    No one gets out of this thing alive.
    But at that age everyone should want to.

  • Stirling Westrup

    Just a minor request. I usually read “Overcoming Bias” via RSS feed, and there isn’t a byline contained in the text of the feed. That means I never know who I’m reading. Lately I’ve simply been assuming that all text here is written by Eli, which is not a bad rule of thumb, but it can lead to confusion at times.

  • Stagyar, see this article.

  • Daniel Humphries


    The other possibility is that they are both overconfident, Eliezer only more so. I suppose it is also logically possible that they are both underconfident.

    On a different note:

    OC’s repeated baiting in the comments of numerous posts (mainly repeating the same charge without evidence, as far as I can tell) is wearing on me, too.

    However, I would indeed like to see this concern addressed. I’ve seen it addressed here and there on posthumanist websites, but I sometimes feel that people proclaiming that the coming tech-revolution will be universal in nature should do some more traveling in Latin America or Africa and be reminded how many people still lack telephones or even electricity. This may not be a very forceful argument, but I often feel I detect a note of rather naive exuberance from people who seem to spend all their lives propped up in front of a computer in Silicon Valley or Oxford or wherever. Is there a Coddled Life Bias in our bestiary?

  • Mason

    Has there been a post about credential bias?

    I don’t know if this is really a bias, but it seems like it may be. The same outrageous statement made by two people, one with a PhD and one without, would receive very different responses. The same goes for questions asked: in the one case it’s a stupid question not worthy of a response; in the other, it’s an interesting new take.

    Of course, people with credentials have proved themselves, and discounting the un-credentialed is an easy way to save time.

  • Stirling Westrup: I see a byline – perhaps it’s your RSS reader? I’m subscribed to:

  • You know, the more passionate you are about things, the more controversial you get. I am sorry, but I have thought a lot about it, and personally I find that human chauvinism obscures our vision most of the time. All I am going to say is: please “think different” when you talk about such topics — you need to — and Mr. Eliezer certainly brings this to the table.

  • Reading this post made me immediately think of this blog. It involves taking an “outside view” of what is most likely and assuming the opposite of your original position and imagining what it would take to persuade you to change your mind.

  • tcpkac

    There’s a body of psychology research which shows that a certain set of stable biases is a part of mental health. Specifically, happy, mentally healthy, and effective people have significant positive biases in the areas of self-esteem, estimation of degree of control over the environment, and optimism about the future.
    By contrast, the set of people who have objectively validated self-assessments in those areas are those who are moderately to severely depressed.
    A good, although not recent, synthesis paper is Taylor & Brown, ‘Illusion and Well-Being’ (1988):
    The ideas have been developed & popularised by Martin Seligman, he of ‘Positive Psychology’ fame.
    So, would the team rather be biased, happy, healthy, and effective, or unbiased, depressive, and inhibited ?

  • tcpkac, if you were offered a pill that would make you overly self-confident (and hence happy – ?) forever at the cost of your rationality, would you take it? Bear in mind that being happy doesn’t make you less wrong.

    I reject your dichotomy. I would rather be unbiased, happy, healthy, and correctly confident in my own abilities. As far as I know, no study has ruled out this combination. Self-deception correlates positively with perceived happiness? Very possibly, but I’m not changing my stance on rationality based on that.

    If you believe that rationality (unbiased-ness?) and happiness are mutually exclusive, or even at odds, you’re in trouble straight away.

  • tcpkac

    Hi Ben, read me more carefully. I am referring to a specific set of biases, experimentally proved to correlate with mental health and happiness. Nowhere do I claim that Taylor & Brown’s, or Seligman’s, work demonstrates that bias in general correlates to health and happiness !

  • From here:

    The asteroid – which carries the rather dull designation 2007 TU24 – will pass by at a distance of 538,000km (334,000 miles), just outside Moon’s orbit.

    Well they were going to call it Alan, but hell, there was already an asteroid with that name! Cheap shot, I know, but science reporting in daily newspapers makes me want to break things. At least this one gets seven paragraphs in before mentioning Hollywood….