Overcoming bias at work for engineers

While I am only an interested hobbyist when it comes to the theory and practice of overcoming bias, it looks like I will be gifted with the opportunity to explore this interest as a side project in my day job at a large tech company.  Specifically, I get to help design and deliver one or more classes in what I think of as the self-help side of behavioral economics – what are the common human biases, and how can we work around them in our everyday lives?

So I thought I’d take advantage of the collaborative filtering aspect of blogging and ask you for thoughts about the most common biases that affect engineers at work, and pointers to information on the best ways of learning to overcome them.  Preferably at a fine granularity – blog posts rather than full books.  An example of the type of approach I’m looking for is Andy Hunt’s talk (and upcoming book) "Refactoring Your Wetware", which is about how the brain works, how engineers misunderstand it (focusing on the logical serial processing and ignoring the intuitive parallel processing), and how that applies to being a programmer.  Sadly it is not yet available as a book or online video, but I’ve requested a copy of the talk he gave at work so perhaps I can fix that.

I’ll post my own model of what I see as the key realizations and skills in another entry, but I’d love to hear your thoughts on the subject.  And if my investigations prove fruitful and I successfully develop some material, I may even be able to post some of it in a few months.

  • Dave

    I would start by mining the various books with the words “Anti-Patterns” in the title. While these books are in general not all that useful for programmers (good programmers already know most of what’s in them, bad programmers probably never will), they provide handy checklists of ways that software projects fail. I would imagine that most of these can be boiled down to the fundamental biases underneath them, usually some variant of the endowment effect, confirmation bias, or the bandwagon effect.

    Random thought: “agile programming” is basically a set of neuro-linguistic hacks for short-circuiting endowment effects.

  • http://yootles.com Daniel Reeves

    Evidence-Based Scheduling (built into the FogBugz bug-tracking system) automatically tracks and corrects for systematic biases in programmers’ time estimates.
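
    To make the idea concrete, here is a toy sketch of that kind of correction (not the FogBugz implementation; the function name and sample numbers are made up for illustration): divide each past estimate by its actual time to get a “velocity”, then Monte Carlo the new estimate against random draws from that history.

    ```python
    import random

    def simulate_completion(new_estimate_hours, past_estimates, past_actuals, trials=10000):
        """Correct a raw estimate using the estimator's own estimate/actual history.

        velocity = estimate / actual, so a chronic under-estimator has velocities < 1.
        Each trial divides the new estimate by a randomly chosen historical velocity.
        """
        velocities = [e / a for e, a in zip(past_estimates, past_actuals)]
        outcomes = sorted(new_estimate_hours / random.choice(velocities)
                          for _ in range(trials))
        # Report the median and a pessimistic 90th-percentile figure.
        return outcomes[trials // 2], outcomes[int(trials * 0.9)]

    # Illustrative history for someone who habitually runs 1.5-2x over their estimates.
    median, p90 = simulate_completion(
        new_estimate_hours=8,
        past_estimates=[4, 2, 6, 3, 5],
        past_actuals=[6, 4, 9, 3, 10],
    )
    print(f"median {median:.1f}h, 90th percentile {p90:.1f}h")
    ```

    The point is that the correction comes from the estimator’s own track record rather than from exhortations to “estimate better”.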

  • http://www.tristram.squarespace.com Tristram Brelstaff

    One source of bias that I consciously try to avoid is performing quick preliminary investigations. I have found these are especially dangerous if there is going to be a significant gap between the preliminary and the final investigation. During this gap my mind tends to inflate the reliability of the preliminary conclusions, making it much harder to get to the truth in the final investigation.

    I first came across this effect many years ago, when plotting a chart for a variable star I was planning to observe. I identified the general area of the star on an atlas and noticed a star marked with a “V” (indicating variable) near there. I was just about to confirm this identification by precise measurement when my mother came in and told me to clear the table because she wanted to lay out the cutlery for dinner. I quickly pencilled a ring round the candidate star and packed my atlas and papers away. After dinner, when I got them out again, my mind had solidified this candidate into ‘the’ star and I drew up my chart accordingly. It took me only a week to suspect that something was wrong, but it wasn’t until several years later, when I analysed all my observations, that I finally realised that I had observed the wrong star.

    In my work as a software engineer, I first came across this effect when I was given the task of enabling the data and instruction caches in an embedded system. It was near the end of the day so I just glanced through the manuals and noted that the caches were enabled by setting a particular bit pattern in a register. I scribbled a few notes in my log book, including what I thought was the hex version of the bit pattern that would do the job, intending to check this more thoroughly the following day. But the next day I was pulled off onto another job and didn’t return to the cache job until a few weeks later. By then I had forgotten about the need to check the bit pattern and just used the value from my logbook. It wasn’t until a few weeks later, when I attempted to confirm the forecast speed-up from enabling the caches, that I discovered I had got the bit pattern wrong (I had used something like 0x0008008 instead of 0x80008000).

    On another occasion, I was analysing a diagnostic data dump after the above embedded system had crashed. A quick glance suggested one possible cause but I didn’t have time to do a proper analysis then. When I came back to the investigation, I just followed that possibility without doing the proper analysis. I wasted a whole day before I realised that I had made a simple error in my quick glance analysis and the cause was elsewhere. After that, I wrote a script to automate the initial analysis of the diagnostic data dumps. Automation can be a good way of avoiding bias.
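
    The script itself doesn’t need to be clever; the value is that it applies the same fixed set of checks, in the same order, to every dump, so the first plausible-looking symptom can’t crowd out the others. Something along these lines (the dump format and the checklist of symptoms here are invented for illustration):

    ```python
    import re
    import sys

    # Hypothetical checklist; a real script would encode whatever symptoms
    # the team has learned to look for in these dumps.
    CHECKS = [
        ("watchdog reset",    re.compile(r"watchdog", re.I)),
        ("stack overflow",    re.compile(r"stack\s+overflow", re.I)),
        ("unaligned access",  re.compile(r"unaligned", re.I)),
        ("assertion failure", re.compile(r"assert(?:ion)?\s+fail", re.I)),
    ]

    def analyse(dump_path):
        text = open(dump_path, errors="replace").read()
        # Run every check every time and report them all, so no single
        # early hypothesis gets privileged by a quick glance.
        for name, pattern in CHECKS:
            hits = len(pattern.findall(text))
            print(f"{name:20s}: {hits} hit(s)" if hits else f"{name:20s}: none")

    if __name__ == "__main__":
        analyse(sys.argv[1])
    ```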

  • http://rolfnelson.blogspot.com Rolf Nelson

    Most of the useful heuristics are going to be useful for *all* professionals, but if your management won’t fund it without the “specifically for engineers” angle, then so be it.

    Software engineers tend to:

    1. Have below-average social skills, compared with others at their professional pay level. This argues, for example, that the “illusion of transparency” should be included in the curriculum. Usual caveats: this is only on average, don’t take offense, yadda yadda. Note that I am an outlier who is exempt from this tendency, as are my friends, colleagues, and any other potentially angry people who happen to read this blog. (Phew, that was close.)

    2. Frequently have to make quick estimates of minor-task completion times, because the decision of whether to add a function point or fix a bug often depends on those times. Perhaps the Planning Fallacy and Overconfidence could be included in the curriculum.

    BTW, see Monash for a resource that appears to have actually tested different ways of teaching critical thinking.

  • Doug S.

    Go read this book. That’s my advice.

  • http://jeffreyellis.org/blog/ Jeffrey Ellis

    I’ve added an entry to my blog to respond to the topic of engineers and biases. You can read it here:

    http://jeffreyellis.org/blog/?p=40

    I would also like to echo Dave’s earlier comment about “Anti-Patterns”. These are not necessarily specific to engineering, but they very nicely express the kinds of dysfunctional thought patterns and behaviors to be avoided by those involved in technical projects.

    Jeff

  • http://entitledtoanopinion.wordpress.com/ TGGP

    On the topic of engineers, a study suggests they are disproportionately likely to become terrorists.

  • Mark Nau

    One that I often see at work is the unexamined assumption that the user of a product is similar to the engineer in ability, preferences, usage pattern, etc. Hence it is common to get command-line-only tools for artists and the like.

  • J Thomas

    TGGP, I see no reason to rule out selection bias in that study. They looked at lists of terrorists, and the lists were likely to include only the more important terrorists found. Then they threw out the names they couldn’t find out anything else about. Lots of room for biased sampling here.

    It could be true, but this evidence is not particularly believable.

  • http://pragprog.com/titles/ahptl Andy Hunt

    My Refactor Your Wetware book that you mention is now available in beta from our website. More to come soon!

    /\ndy
