7 Comments

Feels like comparing the complete works of Shakespeare to the dictionary. You can quantify them as being similar in length, but unless you're looking for a doorstop, you're missing the point.


So when I have a file on my hard drive, and half a dozen backups with different dates on my external hard drive, and a copy on my flash drive, and a couple of copies on burned CDs, do all of those count as separate bits? On one hand, I suppose I'd grant that that achieves parity of comparison. On the other, I'm not sure I see why multiple copies of the same information count as "more information."

If I made a copy of this post, and saved it as a text file, and then copied that text file a million times, would you call that a million times as much information? I'm not sure how meaningful a sense of "information" that is.
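One way to see this concretely: a lossless compressor treats identical copies as pure redundancy, so the compressed size grows far more slowly than the number of copies. A rough sketch in Python (the sample text is a placeholder, and it's scaled down to ten thousand copies so it runs in a moment):

    import zlib

    post = b"If I made a copy of this post and saved it as a text file..." * 4
    one_copy = zlib.compress(post)
    many_copies = zlib.compress(post * 10_000)   # ten thousand identical copies

    print(len(post), "->", len(one_copy))              # one copy, raw vs compressed
    print(len(post) * 10_000, "->", len(many_copies))  # 10,000x the raw bytes, nowhere near 10,000x compressed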


I'm not so sure. HVD will store multiple terabytes and have a transfer rate of 100 megabytes a second. You could keep quite an encyclopedia of cached answers on one of those...


Although it is debatable how much computation a single neuron can carry out, a single computer instruction is not "vastly more complex"; essentially what you are referring to is an emulation penalty (it's like saying that a Z80 is vastly more efficient than a Pentium, since the Pentium has thousands of times more transistors, yet it must run at 50% load in order to execute Z80 machine code in real time). It is about as difficult for a computer to execute the fundamental operations of neuron-based systems (see e.g. http://www.neuron.yale.edu/...) as it is for a human brain to perform addition.


There is a copy of the 3 billion DNA base pairs (although only ~1% code for proteins) in each of the 10-50 trillion cells (except for things like red blood cells, which don't have nuclei). This gives roughly 10^23 bits, so we're counting the info in each cell separately.
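For what it's worth, the arithmetic roughly checks out if each cell is counted separately (the cell count below is an assumed midpoint of the 10-50 trillion range):

    base_pairs = 3e9      # ~3 billion base pairs per cell
    bits_per_bp = 2       # four possible bases -> 2 bits each
    cells = 3e13          # assumed ~30 trillion cells, within the quoted 10-50 trillion range
    print(base_pairs * bits_per_bp * cells)   # ~1.8e23, i.e. on the order of 10^23 bits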


The 6.4 × 10^18 instructions per second that humankind can carry out on its general-purpose computers in 2007 are in the same ballpark as the maximum number of nerve impulses executed by one human brain per second (10^17).

Could be, but the comparison is deeply flawed. A computer-chip instruction is vastly more complex than a nerve impulse; indeed the phrase "execute a nerve impulse" is rather vague. Computer instructions are things like "add two numbers" or "store this number in that address"; the equivalent operations in a human brain require many, many nerve impulses. Conversely, high-level things like "recognise a face" are done by brains in a way we don't yet understand algorithmically; whether that takes more or fewer nerve impulses than the same operation takes instructions on a computer is not clear.
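Taking the two quoted figures at face value, the gap is only a factor of about 64, so they are indeed within a couple of orders of magnitude of each other; the problem is what each unit actually buys you, not the arithmetic:

    computer_ips   = 6.4e18   # instructions per second, all general-purpose computers, 2007 (quoted figure)
    brain_impulses = 1e17     # maximum nerve impulses per second, one human brain (quoted figure)
    print(computer_ips / brain_impulses)   # 64.0 -- "same ballpark", but instructions and impulses aren't comparable units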


Why "a human adult"? Why not a fertilized human ovum, which is to a first approximation genetically identical to the adult it may one day become? Are we counting the DNA in the nucleus of every single cell as separate information? That's something like trillionfold redundancy, isn't it?
