I have said things like: "We should expect brain emulation to be feasible because brains function to process signals, and the decoupling of signal dimensions from other system dimensions is central to achieving the function of a signal processor."
"your mind is literally a signal processing system."
My previous post was out of line; I leaped before I looked. I have replaced it with this, for your consideration.
With apologies to physics, let's assume all forces are just different densities of the same thing. For the sake of this discussion, the strong and weak forces are different forms of gravity: different densities of gravity.
In other words "gravity" is also a kind of signal.
I think the problem you're experiencing has to do with how you define a signal. In a digital world the signal is electricity. A switch is either on or off. There's no middle ground. When you talk about coupling and decoupling you're talking about an absolute condition. When you talk about software you can talk about software as being close to or far from the machine – tightly bound or loosely bound to the machine.
That is, at least until one sees that (digital) electrical switches are in fact pendulums as well.
So in essence a pendulum is a signal processor. The processing that takes place is its movement along its trajectory. There is input and output here, just not at the level of complexity of this discussion.
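A minimal sketch of this reading (the function name and parameters here are illustrative, not from the thread): a damped pendulum treated as a system that maps an input, an initial impulse, to an output, the trajectory it traces out.

```python
import math

def pendulum_trajectory(impulse, steps=1000, dt=0.01,
                        length=1.0, damping=0.1, g=9.81):
    """Map an input impulse (initial angular velocity) to an output trajectory."""
    theta, omega = 0.0, impulse  # the "signal" enters as an initial push
    trajectory = []
    for _ in range(steps):
        # Semi-implicit Euler step for a damped pendulum
        alpha = -(g / length) * math.sin(theta) - damping * omega
        omega += alpha * dt
        theta += omega * dt
        trajectory.append(theta)
    return trajectory

out = pendulum_trajectory(0.5)  # input: a small push; output: a decaying swing
```

The input and output here are exactly the comment's point: the "processing" is nothing but the system's own motion, with the damping term slowly erasing the input signal.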
Rather than a brain that merely processes a signal, the brain itself is also a signal. The input and output are merely a form of trajectory for an organic machine that is already in motion.
We should expect brain emulation to be feasible because brains function to process signals, and the decoupling of signal dimensions from other system dimensions is central to achieving the function of a signal processor... our standard theories of how physical devices achieve signal processing functions predicts that we can replicate, or “emulate”, the same signal processing functions in quite different physical devices.
Here's an attempt to clarify Bryan's argument. When I do, I find some virtue in it:
To have a perfect (or pure) signal processor, you must decouple signal dimensions from other system dimensions. A properly functioning computer is (while it is properly functioning) a pure signal processor.
Evolutionary pressure will make the brain an increasingly pure signal processor. But, it is unlikely that this process asymptotes at perfect signal processing, for two reasons:
1. The brain has other functions besides signal processing as well as nonfunctional constraints (such as size).
2. Evolution is limited by history: you can't reach every point in design space from where you are now by an evolutionary process.
Therefore, the fact that the brain is "mainly" (an excessively vague term) a signal processor doesn't imply that it can be emulated by a computer. The brain probably has impurities in its execution of signal processing which are essential to its functioning as a signal processor.
Which is to say, we should be able to emulate the signal processing functions of the brain, but we can't do it by emulating part of the brain, as no part of the brain is a pure signal processor.
Same response. I don't see how anyone who has read anything you say about ems would say your argument is based on a metaphor.
Robin's argument that there's no in-principle obstacle to emulating human brains seems sound. Even if the function is noncomputable, at least according to my limited understanding, it should be subject to approximation as closely as necessary (with quantum provisos that probably aren't relevant).
But maybe there's another fallacy involved--I don't know if it's been named; it's the jump from possibility to probability. In general, if we can give no reasons that an outcome is probable, we conclude it is improbable. Possibility alone isn't enough for anyone to take the em scenario seriously--unless Robin is planning a work of science fiction.
And our standard theories of how physical devices achieve signal processing functions predicts that we can replicate, or “emulate”, the same signal processing functions in quite different physical devices.
Is this true regardless of whether the signal-processing functions are computable? Or is it saying that signal-processing functions are necessarily computable?
I think the problem might be in the expression "signal processing". Computers can do three things: add, subtract, and compare. Computers are unable to literally or actually value anything. That is predetermined in the software and as such can be only token values.
Very little that comes from the hand of man is "complex". Most of it is just complicated. Logic and reason are only "loosely" connected.
I think we forget that "1+1=2" is just an agreement among people. There really is no such thing as 2. While 2, 3, 4, etc. are unquestionably helpful, there is only 0 and 1.
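A small illustration of the point (the helper below is hypothetical, written just for this comment): every natural number can be spelled out using only the symbols 0 and 1, so "2" is our agreed shorthand for the numeral 10.

```python
def to_binary(n):
    """Represent a natural number using only the symbols 0 and 1."""
    if n == 0:
        return "0"
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits  # peel off the lowest bit
        n //= 2
    return bits

# "1+1=2" restated with only 0s and 1s: the sum is the numeral 10
assert to_binary(1 + 1) == "10"
```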
Consider how much effort has been given to the God particle. Now consider the definition of a point. There is no comparison: the point is profoundly smaller than any subatomic particle.
This would mean that the vast majority of the universe consists of something other than the tiny subset we call matter.
Metaphors arise out of density. Consider high definition and low definition displays. Metaphors can express attributes of principle, but mostly they are surrealisms.
To expand on Kevin Dick's comment on Bryan's post, I think the issue here is that to Robin and probably most readers here, "computer" means "signal processor" or something, but to Bryan, "computer" means "one of those things you buy from Apple, Dell, etc." So to Bryan, "The brain is a computer" can only be a metaphor.
Bryan's two examples are also pretty terrible, because they both still work metaphorically. A horse starved of oats will not go; a car starved of gasoline will not go. The general principle of conservation of energy remains true and illuminated by the metaphor.
Similarly, if your brain coolant system breaks, your brain is pretty rapidly going to stop working. (Of course, your coolant system and your fuel system are the same, so it'll probably stop working sooner because it ran out of fuel. But you can imagine just the coolant system breaking by being somewhere where the wet bulb temperature is high enough that sweating actually raises your internal temperature- and pretty soon your brain will shut down.)
(Reading the comments on Bryan's article, these points have already been made more succinctly there; but I'll make them here anyway.)
Attaboy, Robin. Perhaps this would be clearer if we did a little bit of term-unpacking. It's not so much that brains are "signal processing systems", which sounds like a term of art. It's that what they do is to process signals. They receive inputs from the external world through our sense organs, they make sense of the data, draw correlations, and make plans about how to navigate the world we see. Additionally, our brains do so in a physical way. We're learning more and more over time about how they do so, and there's no indication that there's anything behind it but the atoms, molecules, and organs that we can see, analyze, and replicate to any level of detail that we understand.
Last night, I made essentially the same comment on Bryan's post. Though I used "information processor" instead of "signal processor", my explanation very closely paralleled yours.
This is frightening in an ironic way because it appears that reading your blog has partially uploaded a fairly complex blob of signals/information from your brain to mine :-)
Yes. The proposition that all computation, including thought, can be translated into some universal primitive form (such as Turing machines or recursive functions) has been out there formally for the better part of a century. Mathematicians and logicians and computer scientists and so forth have been looking actively for ways to disprove it, and quantum computing is about the only fundamental difference they've found so far. And quantum computing doesn't mean the translation can't be done in principle, just that it might be (exponentially) more expensive than people once thought.
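A minimal sketch of that universal primitive form (a toy simulator written for this comment, not any standard library): a one-tape Turing machine, here with a transition table that increments a binary number. The point is only that the computation reduces entirely to reading a symbol, writing a symbol, and moving the head.

```python
def run_tm(tape, transitions, state="right", blank="_"):
    """Run a simple one-tape Turing machine until it reaches the halt state."""
    cells = dict(enumerate(tape))
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Transition table for binary increment (most significant bit first):
# scan right to the end of the number, then carry back to the left.
INC = {
    ("right", "0"): ("0", "R", "right"),
    ("right", "1"): ("1", "R", "right"),
    ("right", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "halt"),
    ("carry", "_"): ("1", "L", "halt"),
}

print(run_tm("1011", INC))  # 1011 (eleven) + 1 -> 1100 (twelve)
```

Anything a laptop (or, per the thesis, a brain) computes can in principle be restated as some such table, which is why the translation claim has held up so well.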
It seems to me that the proper analogy is not to comparing cars to horses, but something like the first or second laws of thermodynamics (interpreted literally and correctly, not the misstatements of the second law used by creationists and environmentalists). People have been trying hard for a long time to disprove those laws, without success (though with the wrinkle that mass turns out to be energy). The claim that cars are horses is not made carefully or seriously, and disproving it is a moment's work. The claims that energy is fungible and that computation is fungible have been made carefully and seriously for a long time, and have held up very well, so it's unreasonable for Caplan to criticize you for running with the idea.
The evolutionary cause of brains is their signal processing functions. If brains didn't compute functional movement output from sensory input, they wouldn't have evolved. Of course they are signal processors, literally. This is trivial.