Hofstadter’s Law

I read Douglas Hofstadter’s new book I Am a Strange Loop, which argues that consciousness happens spontaneously after a system of dynamic patterns is sufficiently complex. Strange loops of self-awareness existing on multiple levels (as in Gödel’s famous proof) create hallucinations of a hallucination, and so an "I" forms. Anyway, as I often do when reading nonfiction, I read a little more about the author, and was struck by Hofstadter’s Law: it always takes longer than you expect, even when you take into account Hofstadter’s Law (note this is recursive and paradoxical, which is Hofstadter’s specialty). The Law turns out to be well known among programmers, many of whom have read Hofstadter’s Gödel, Escher, Bach.

As they say, Hofstadter’s Law is funny because it rings true to many programmers, who often work on complex projects that take years to complete. It is clearly an alternative to the Law of Iterated Expectations. Why might people involved in sufficiently complicated tasks (writing a paper, a book, building a deck) generally underestimate how long they will take? I think the main reason is that goals become self-fulfilling, so any lengthening of a goal time adds to the total time, the way bureaucracies spend up to the limit of their budget, whatever it is. And just like a group of people, an individual has multiple goals: to watch TV, to get a project done, to be a better golfer. A successful goal needs an optimistic bias to compete with your other goals, each of which probably has its own biased homunculus advocating for it in your mind.
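One toy way to formalize the self-fulfilling-deadline idea (all numbers and the multiplicative-overrun assumption are hypothetical illustrations, not anything Hofstadter wrote): if the overrun scales with whatever deadline you commit to, padding your estimate never closes the gap.

```python
# Toy model of Hofstadter's Law as a self-fulfilling deadline.
# Hypothetical assumption: actual time = committed estimate * overrun
# factor, i.e., the overrun multiplies whatever deadline you announce,
# just as bureaucracies spend up to whatever budget limit they are given.

OVERRUN = 1.5  # hypothetical overrun factor on any committed estimate


def actual_time(estimate: float) -> float:
    """Actual completion time under the self-fulfilling model."""
    return estimate * OVERRUN


naive = 10.0                    # weeks you first guess
padded = actual_time(naive)     # pad by the known overrun: 15 weeks
# ...but the padded deadline itself overruns:
print(actual_time(padded))      # 22.5 weeks: longer than expected, even
                                # after taking the Law into account
```

The point of the sketch is only that under this (assumed) multiplicative model, correcting your estimate moves the target rather than hitting it, which is the recursive joke in the Law.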

On one level an unbiased expectation is optimal because it allows us to allocate our resources more efficiently. But there are many cases where this is not true, where a little too much hope and faith actually makes you a more successful person, and more fun to be around. Just think about how annoying ‘brutally frank’ people are: they are jerks. Or think about the guy who believes he is a better dancer than he really is; his confidence actually makes him a better dancer, because part of good dancing is not being self-conscious. Robert Trivers has pointed out that self-deception, in moderation, is an evolutionary advantage, in that a liar who believes his own lies is a more effective persuader than a liar who knows he is lying, and fundamentally we are social animals trying to convince others to do this or think that.

  • http://cob.jmu.edu/rosserjb Barkley Rosser

    Part of the problem with all of this is the unexpected appearance of inevitable infinite regresses that must be artificially shut off. This was a problem first recognized by Luce and Raiffa, thinking about the problem of bounded rationality.

    So, the problem is: in a world of imperfect information, how much effort should we spend gathering information about the problem we wish to solve? But, ah, there is the higher-level problem: how much time should we spend thinking about how long we should gather information? Which implies yet the next level up, and so forth without limit in principle.

    Well, for most of us it is obvious that one does not want to get too caught up in such loops because one will have very high probabilities of spending too much time thinking about thinking about thinking about… if one lets oneself get too high in the levels of doing so. So, we tend to just artificially cut things off at somewhat lower levels, using rules of thumb or other intuitive mechanisms. We have committees on committees, but I have never heard of a committee on committees on committees, although some of the plannification exercises in some of the more navel-gazing central planning agencies began to resemble this.

    But, getting back to the original problem. Getting caught up in such escalating levels of analysis is more likely for someone who is in the depths of solving a problem (or completing a major project, or whatever). One runs into problems, and then one finds oneself worrying how long or in what way one should solve the problem, and then…
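    The regress described above can be sketched as a toy cost series (the parameters are hypothetical illustrations, not from Luce and Raiffa): if each meta-level of deliberation costs some fraction of the level below it, the regress converges and a low cutoff loses little; if each level costs as much as the last, it diverges and must be cut off by a rule of thumb.

    ```python
    # Toy model of the bounded-rationality regress: the cost of thinking
    # about thinking about ... how much information to gather.
    # All parameters are hypothetical illustrations.

    def total_deliberation_cost(depth_limit: int, base_cost: float = 1.0,
                                ratio: float = 0.5) -> float:
        """Total cost of deliberating up to (but not beyond) depth_limit,
        where each meta-level costs `ratio` times the level below it."""
        return sum(base_cost * ratio ** k for k in range(depth_limit))

    # With ratio < 1 the series converges, so a low cutoff loses little:
    print(total_deliberation_cost(3))              # 1 + 0.5 + 0.25 = 1.75
    # With ratio >= 1 it diverges, so some arbitrary cutoff is mandatory:
    print(total_deliberation_cost(3, ratio=1.0))   # 3.0, and growing with depth
    ```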

  • http://timmiano.blogspot.com/2007/05/incompletely-understanding-unknowable.html ~Tim Miano Seeks Job

    Incompletely Understanding the Unknowable

    I’m now glad that our search for understanding will never come to an end. Gödel’s theorem ensured there would always be a job for mathematicians.

  • Dagon

    Not directly related to your argument, but beware the word “sufficiently” in arguments or definitions. “Consciousness happens spontaneously after a system of dynamic patterns is sufficiently complex” is meaningless, as you can just note that any non-conscious system is clearly not sufficiently complex.

    My favorite example of this is the contrapositive to Clarke’s Law: Any technology distinguishable from magic is insufficiently advanced.

  • Roy Haddad

    It’s not entirely meaningless, as it states that consciousnesses that we do know exist are composed of these systems of dynamic patterns, as opposed to something else, and it implies that any way of creating similar patterns (say, mechanical computation) would also create consciousness.

  • eric

    The philosopher David Chalmers was on Bloggingheads.tv last weekend, and he was a student of Hofstadter. He mentioned that toasters have a certain amount of consciousness. I haven’t smoked pot since college, so he lost me there.

    I don’t understand how a bunch of neurons obeying the laws of physics (through chemistry) generates my consciousness, yet I’m also certain that my consciousness is intrinsically related to my brain and will extinguish with it. But if you look at the brain as an evolutionary creation with all its deficiencies, you see it as an aggressor (predation), a deceiver (camouflage), a persuader (seduction, selling). Getting the math right (unbiasedness) is merely one of several ‘good’ qualities in our brains.

  • http://amnap.blogspot.com/ Matthew C

    I don’t understand how a bunch of neurons obeying the laws of physics (thru chemistry) generates my consciousness, yet I’m also certain that my consciousness is intrinsically related to my brain and will extinguish with it.

    People have been ‘certain’ of a great number of things throughout history, and many of them ended up being provably wrong. It is probably a good thing to be ‘certain’ of as few things as possible, and thereby be open to finding out new information that might modify or expand or even invalidate parts of one’s worldview.

    I like what Jerry Fodor says on this topic:

    Nobody has the slightest idea how anything material could be conscious. Nobody even knows what it would be like to have the slightest idea about how anything material could be conscious. So much for the philosophy of consciousness. . .

  • http://www.weird.co.uk/martin/ Martin Poulter

    The answer you give about Hofstadter’s Law doesn’t seem relevant to the question you’re asking. I agree with your explanation that, if I expect it will take 10 weeks to do something but I’m given 15 weeks to do it, it will take me 15 weeks to do it. But Hofstadter’s Law says that, given a task that will take 15 weeks, people will estimate it to take them 10 weeks, even when they think they are being generous. This is logically a different issue.

    Given that this site is about bias, you could have mentioned superiority bias (i.e. thinking, “This would take the average person 15 weeks, but of course I’m better than average, so it will only take me 10 weeks”) or beneffectance effect on memory (=egocentric attribution errors) (i.e. “In the past, this took me 15 weeks, but the delay was due to circumstances beyond my control. When I have control, it will take 10 weeks”.) or some variation of availability heuristic (remembering the days when you powered through the task rather than the days when you were distracted by minutiae).

    Personally, I find arrogant, self-inflated people to be jerks and honest, self-critical people to be better company. I don’t think I’m alone in that. Neither my preference nor your preference counts as a scientific argument.

    As for bias being evolutionarily advantageous (in the context of deception): maybe, but so is violence against out-groups. That doesn’t mean that it’s morally good or socially desirable to have these biases. Your final point seems to be that lying is a more “fundamental” human behaviour than others. That sounds very arbitrary to say the least.