More Bits

You can get info by listening to general broadcasts, by asking others specifically, by remembering previous related info, or by figuring it out yourself on the fly, either using general slow methods or fast context-specific methods. Improving tech seems to increasingly favor asking specifically for info over listening to broadcasts. It also favors remembering stuff, and especially asking electronic assistants. For such assistants, tech increasingly favors figuring stuff out on the fly, especially via hardware dedicated to a specific purpose, over remembering or asking.

All this is because the world’s ability to crunch bits has grown fast, much faster than our ability to store or talk, and bidirectional talk has grown much faster than broadcast talk. Even so, the raw storage capacity of a single human body still outshines all our devices:

During the period from 1986 to 2007 … [world-wide] general-purpose computing capacity grew at an annual rate of 58%. The world’s capacity for bidirectional telecommunication grew at 28% per year, closely followed by the increase in globally stored information (23%). Humankind’s capacity for unidirectional information diffusion through broadcasting channels has experienced comparatively modest annual growth (6%). … The per capita [computing] capacity of our sample of application specific machine mediators grew … [at] 83% … per year. …

The 6.4 × 10^18 instructions per second that humankind can carry out on its general-purpose computers in 2007 are in the same ballpark area as the maximum number of nerve impulses executed by one human brain per second (10^17). The 2.4 × 10^21 bits stored by humanity in all of its technological devices in 2007 is approaching an order of magnitude of the roughly 10^23 bits stored in the DNA of a human adult. (more)
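The quoted figures can be sanity-checked with a little arithmetic (a sketch; the 21-year window and all capacity numbers are taken from the excerpt above):

```python
# Compounding the quoted annual growth rates over the 1986-2007 window
# (21 years) shows how far apart the cumulative growth factors end up.
years = 2007 - 1986  # 21

def total_growth(annual_rate, years=years):
    """Cumulative growth factor implied by a constant annual rate."""
    return (1 + annual_rate) ** years

print(total_growth(0.58))  # general-purpose computing: ~15,000x
print(total_growth(0.28))  # bidirectional telecom:     ~180x
print(total_growth(0.23))  # stored information:        ~77x
print(total_growth(0.06))  # broadcasting:              ~3.4x

# And the two "ballpark" comparisons from the excerpt:
print(6.4e18 / 1e17)   # world computing vs. one brain's impulses: ~64x
print(1e23 / 2.4e21)   # one adult's DNA vs. all stored tech bits: ~42x
```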

  • William H. Stoddard

    Why “a human adult”? Why not a fertilized human ovum, which is to a first approximation genetically identical to the adult it may one day become? Are we counting the DNA in the nucleus of every single cell as separate information? That’s something like trillionfold redundancy, isn’t it?

    • There is a copy of the 3 billion DNA base pairs (although only ~1% code for proteins) in each of the 10–50 trillion cells (except for things like red blood cells, which don’t have nuclei). This gives ~10^23 bits, so we’re counting the info in each cell separately.
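That estimate can be reproduced in a few lines (a rough sketch; the 2 bits per base pair and a ~4 × 10^13 cell count, chosen from within the quoted 10–50 trillion range, are assumptions):

```python
# Rough check of the ~1e23 bits figure for one human adult.
base_pairs_per_cell = 3e9  # ~3 billion base pairs per cell
bits_per_base_pair = 2     # each base pair is one of 4 possibilities
cells = 4e13               # assumed, within the 10-50 trillion range

total_bits = base_pairs_per_cell * bits_per_base_pair * cells
print(f"{total_bits:.1e}")  # ~2.4e23, the same order as the quoted 1e23
```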

      • William H. Stoddard

        So when I have a file on my hard drive, and half a dozen backups with different dates on my external hard drive, and a copy on my flash drive, and a couple of copies on burned CDs, do all of those count as separate bits? On one hand, I suppose I’d grant that that achieves parity of comparison. On the other, I’m not sure I see why multiple copies of the same information count as “more information.”

        If I made a copy of this post, and saved it as a text file, and then copied that text file a million times, would you call that a million times as much information? I’m not sure how meaningful a sense of “information” that is.

  • The 6.4 × 10^18 instructions per second that humankind can carry out on its general-purpose computers in 2007 are in the same ballpark area as the maximum number of nerve impulses executed by one human brain per second (10^17).

    Could be, but the comparison is deeply flawed. A computer-chip instruction is vastly more complex than a nerve impulse; indeed the phrase “execute a nerve impulse” is rather vague. Computer instructions are things like “add two numbers” or “store this number in that address”; the equivalent operations in a human brain require many, many nerve impulses. Conversely, high-level things like “recognise a face” are done by brains in a way we don’t yet understand algorithmically; whether it requires more or fewer nerve impulses than instructions in the same operation on a computer is not clear.

    • Josh Burroughs

      Although it is debatable how much computation a single neuron can carry out, a single computer instruction is not “vastly more complex”; essentially what you are referring to is an emulation penalty (it’s like saying that a Z80 is vastly more efficient than a Pentium, since the Pentium has thousands of times more transistors, yet it must run at 50% load in order to execute Z80 machine code in real time). It is similarly difficult for computers to execute the fundamental operations of neuron-based systems as it is for a human brain to perform addition.

  • candy

    I’m not so sure.
    An HVD will store multiple terabytes and have a transfer rate of 100 megabytes per second. You could keep quite an encyclopedia of cached answers on one of those…

  • Feels like comparing the complete works of Shakespeare to the dictionary. You can quantify them as being similar in length, but unless you’re looking for a doorstop, you’re missing the point.
