In response to Richard Jones’ book review, I said: So according to Jones, we can’t trust anthropologists to describe foragers they’ve met, we can’t trust economics when tech changes society, and familiar design principles fail for understanding brains and tiny chemical systems. Apparently only his field, physics, can be trusted well outside current experience. In reply, I say I’d rather rely on the experts of each field than on his generic skepticism. Brain scientists see familiar design principles as applying to brains, even brains designed by evolution; economists see economics as applying to past and distant societies with different tech; and anthropologists think they can understand the cultures they visit.
Jones' objection to Hanson is indeed unfounded. But there is a more fundamental technological problem with Whole Brain Emulation: http://blog.knowinghumans.n...
1) I think you meant "...since they are made of molecules one can’t emulate them withOUT measuring and modeling at the molecular level."
2) While bio *does* employ modularity, isn't it also true that bio is much more casual about modularity than engineering is? My impression is that in bio, one-to-many and many-to-many mappings of function to subsystem are common; while in engineering, modularity has to date been absolutely essential to e.g. large-team development, efficient maintenance, and efficient enhancement.
We know of no option other than to collect signals from the environment, infer that environment from those signals, and then calculate actions that benefit us in that environment.
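The collect-infer-act loop above can be sketched in a few lines. This is a minimal toy, not a claim about how brains do it: the two-state environment, the sensor reliabilities, and the payoff table are all hypothetical stand-ins.

```python
# Toy sketch of the sense -> infer -> act loop: observe a noisy signal,
# infer the environment state by Bayes' rule, pick the higher-payoff action.
# All numbers here (sensor reliabilities, payoffs) are made up for illustration.

def infer(prior_sunny, signal_bright,
          p_bright_given_sunny=0.9, p_bright_given_rainy=0.2):
    """Posterior probability the environment is 'sunny' given one signal."""
    likelihood = p_bright_given_sunny if signal_bright else 1 - p_bright_given_sunny
    alt = p_bright_given_rainy if signal_bright else 1 - p_bright_given_rainy
    return likelihood * prior_sunny / (
        likelihood * prior_sunny + alt * (1 - prior_sunny))

def act(belief_sunny):
    """Choose the action with higher expected payoff under the inferred state."""
    # hypothetical payoffs: go_out pays +1 if sunny, -1 if rainy; stay_in pays 0
    expected_go_out = belief_sunny * 1 + (1 - belief_sunny) * (-1)
    return "go_out" if expected_go_out > 0 else "stay_in"

belief = infer(prior_sunny=0.5, signal_bright=True)
print(belief, act(belief))  # belief ~0.818, so "go_out"
```

The point is only that each stage (signal, inference, action) is forced: drop any one of them and the agent can no longer tie its behavior to the environment.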
"Signal Mappers Decouple" (which you linked) seems intended to address intuitions like mine. There you define signal processing: "A signal processor is designed to maintain some intended relation between particular inputs and outputs."
But human cognition (unlike sensation) doesn't seem designed to maintain some intended input-output relation. Instead it serves to optimize the relationship between output and the (probable) environment. How certain are we that the brain accomplishes (or would best accomplish) optimization to the environment through signal processing?
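The distinction being drawn here can be made concrete with a toy contrast: a signal processor maintains a fixed input-output relation, while an environment-optimizer has no fixed relation and instead adjusts its outputs toward whatever the environment rewards. Everything below (the linear policy, the reward shape, the step size) is a hypothetical illustration, not a model of cognition.

```python
# A signal processor in Hanson's quoted sense: a fixed, intended I/O relation.
def signal_processor(x):
    return 2 * x + 1  # the relation itself is the design spec

# An environment-optimizer: its input-output mapping is not fixed by design;
# it drifts toward whatever the (initially unknown) environment rewards.
class EnvironmentOptimizer:
    def __init__(self):
        self.gain = 0.0  # starts with no particular I/O relation

    def act(self, x):
        return self.gain * x

    def learn(self, x, target, step=0.05):
        # gradient step on squared error between action and rewarded action
        error = self.act(x) - target * x
        self.gain -= step * 2 * error * x

agent = EnvironmentOptimizer()
for _ in range(200):
    agent.learn(x=1.0, target=3.0)  # environment happens to reward gain = 3
print(round(agent.gain, 3))  # converges to 3.0
```

The signal processor is correct iff its outputs match the spec; the optimizer is correct iff its outputs suit the environment, whatever mapping that requires.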
A point that I sometimes suspect em-skeptics of missing is that emulations don't need to be exact. If we discard a given feature as noise, the question isn't whether we maintain perfect fidelity but whether we retain substantial functionality.
For instance, it seems possible that the effects of a concussion are mediated by effects most neural models would discard. But being unable to model a concussion would not hinder the em revolution in the slightest.
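A toy illustration of the fidelity point: in the sketch below, a simple leaky integrate-and-fire neuron (a standard textbook model, used here purely as a stand-in) produces the same spike train whether or not we include a tiny sub-threshold "noise" term, because the noise is far smaller than the margin at threshold. The specific parameters are hypothetical.

```python
import random

def spike_times(inputs, noise_std=0.0, threshold=1.0, leak=0.9, seed=0):
    """Leaky integrate-and-fire neuron; returns the time steps at which it spikes."""
    rng = random.Random(seed)
    v, spikes = 0.0, []
    for t, i in enumerate(inputs):
        v = leak * v + i + rng.gauss(0.0, noise_std)
        if v >= threshold:
            spikes.append(t)
            v = 0.0  # reset after spiking
    return spikes

inputs = [0.4] * 20
exact = spike_times(inputs, noise_std=0.0)
lossy = spike_times(inputs, noise_std=0.001)  # tiny effect a model could discard
print(exact == lossy)  # True: the discarded detail doesn't change the output
```

Discarding the noise term changes the membrane voltage slightly at every step, yet the functional output (the spike train) is identical. That is the relevant sense of "retain substantial functionality."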
Does your argument depend on cognition being a form of signal transmission? The signal-transmission model probably applies to sensation, but even perception goes beyond the data. (In Ulric Neisser's phrase, analysis is accomplished by synthesis.)
Adrian Thompson's research into evolving circuits in FPGAs found no digital abstraction layer. The evolved circuit was very sensitive to analog implementation details, to the point of reliance on parasitic circuit elements for correct function.
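For readers unfamiliar with the method: Thompson evolved circuit configurations with a genetic algorithm rather than designing them. The sketch below is a toy version of that search loop, with a 16-entry truth table standing in for the FPGA configuration and 4-bit parity as an arbitrary target; none of these details come from Thompson's actual experiments.

```python
import random

# Toy genetic algorithm: evolve a 16-bit truth table (a stand-in "circuit"
# for a 4-input boolean function) to match a target table. The target,
# representation, and GA parameters are hypothetical, not Thompson's setup.
rng = random.Random(1)
TARGET = [1 if bin(i).count("1") % 2 else 0 for i in range(16)]  # 4-bit parity

def fitness(table):
    """Number of the 16 input cases on which the circuit matches the target."""
    return sum(a == b for a, b in zip(table, TARGET))

def mutate(table, rate=0.1):
    """Flip each output bit independently with probability `rate`."""
    return [b ^ (rng.random() < rate) for b in table]

population = [[rng.randint(0, 1) for _ in range(16)] for _ in range(20)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    best = population[0]
    if fitness(best) == 16:
        break
    # elitism: keep the top half, refill with mutated copies of survivors
    population = population[:10] + [mutate(rng.choice(population[:10]))
                                    for _ in range(10)]

print(fitness(best))
```

Note what this loop does *not* do: it never checks modularity, abstraction layers, or any design principle — only fitness. That is why evolved solutions, like Thompson's circuit, are free to exploit whatever physical details happen to help.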
Typo fixed; thanks.
How much you can discard depends on how much you understand. If you understood everything, you could build an AI.
Processing is very different from transmission. Processing can go well beyond the data.
An Evolved Circuit, Intrinsic in Silicon, Entwined with Physics: http://citeseerx.ist.psu.ed...