"First thing we do, let's kill all the lawyers." (Shakespeare, Henry VI, Part 2)

People have long been suspicious of "middlemen," e.g., traders, lawyers, bankers, salesmen, marketers, managers, and politicians. For millennia, most people have suspected such middlemen of being mostly social parasites, and many "Utopian" reforms have planned to eliminate them. Economists have faced an uphill battle arguing that middlemen usually serve important functions. Among intellectuals, engineers and physical scientists find it especially hard to appreciate roles other than designing, building, maintaining, fueling, and distributing physical goods.
Comparing intelligence and consciousness, I've come to the conclusion that extreme rationality isn't that great.
The film 'Equilibrium' (starring Christian Bale) tells the story of a totalitarian state where all superstimuli have been banned (thus high art is banned and emotions are not allowed).
In the finale of the film, Bale fights his way into the office of the 'father' (ruler of the totalitarian state). He is amazed to find... you guessed it... the office of the main bad guy is saturated with high art and super-stimuli! Watch the final fight scenes; they are very cool. Bale's character uses 'Gun Kata', the ability to dodge bullets by extrapolating opponents' moves in advance. It turns out that emotions were the source of his power all along.
Equilibrium Finale: Emotion beats Reason
The moral of the story is clear. The coming Singularitarian battle for the dominion of the universe will be decisive. Consciousness (Sentient AI wireheads/super-stimulants) will beat Intelligence (Non-sentient RPOPs).
I wax eloquent about this book's ideas on consciousness in my post. Among other things, I suggest that intelligence without consciousness is a lot like what we are getting as we try to do Artificial Intelligence. That is, we are finding it is a lot easier to build a dead brain than a live one.
Consciousness is what integrates different utility functions, enabling them to be compared and combined. It does this by categorization (grouping and forming analogies between concepts) in order to enable mediation between sub-agents. This grounds symbols and quantifies raw information in order to create a unified narrative of past and future. Consciousness at root is causality itself; it is a third-order (relational) causality that is everywhere present to some extent.
The conceptual "I" and consciousness are two entirely different realities. . . Consciousness is foundational and prior to any notions of "I" or "me" that arise.
You are looking at an input, I'm talking about output. Squabbling is only one of many things that can make coordination hard.
> Cells only successfully coordinate in a rather limited sense. The coordination of cells has literally taken eons to achieve
I totally disagree, but maybe we aren't talking about the same thing. I say that the coordination of cells within an individual organism is 100% successful because there is zero conflict or selfishness. Any cell or any number of cells will perform total self-sacrifice in order to transmit the genome of the individual to which they belong. Thus, one goal, zero squabbling. Admittedly, cancer does represent the breakdown of this process, but if you pay enough energy in preventing it you can minimize your expected fitness loss due to cancer. Thus cancer is not something of great biological importance in general, though it could make for an important fitness difference between two lineages with otherwise similar genomes and similar fitness. My descendants would eventually replace yours if I could eliminate more cancer risk using less energy. But cancer is not a huge obstacle to having life at all, or to having multicellular life at all.
Perhaps you are using some broader sense of "successfully coordinate."
I'm suspicious of a particular subset of middlemen, the verbally gifted. I host something of a defense of middlemen by Thomas Sowell, "Are Jews Generic?"
Robin, you have used a kind of utilitarianism to argue against deontological libertarianism. Don't many of the same arguments apply to deontological truth-seeking? If it is good for people to get what they want, and they want self-deception, isn't it good for them to self-deceive?
Munger seems to equate need with demand, which is roughly equivalent to mistaking hunger for wealth.
> I have a much better idea of what is likely to be preserved in the distant future than of how much different folks today value those things.
Maybe that should be a topic for consideration in some future analysis you do. Lay out what you think will be preserved, and what won't, and people can opine about how much they value what you think will be preserved. In particular, it seems that it would be useful to flesh out your notion of functional analogues of human traits, as there seem to be "layers" of analogy, some of which might be considered more significant than others.
A tunic is a functional analogue of a business suit, and I don't think that this layer of analogy is axiologically relevant. A futuristic, efficient automated trading program with AI is a functional analogue of a trader in a business suit, but people might consider that we have now jumped to an axiologically relevant layer. (I might opine that) we would lose little if all businesspeople shifted from suits to tunics, but we would lose more if all businesspeople were replaced by specialized AIs.
It seems that this issue is a source of persistent disagreement between SIAI/Eliezer and yourself.
Cells only successfully coordinate in a rather limited sense. The coordination of cells has literally taken eons to achieve, and it is pretty slow to adapt to changing conditions. The degree of adaptation to local conditions embodied by ants and bees is trivial compared to the coordination that humans manage today. In terms of rapid adaptation of coordination to changing conditions, humans are doing a far better job.
It's not that coordination is hard; it's that coordination is hard for humans. Ants, honeybees, and individual cells of multicellular organisms seem to have solved it. It's hard for evolved beings to coordinate at a level at which selection occurs, but at levels where selection does not occur, it's a solved problem - animals with cancer don't reproduce very much.
Well, eBay is a middleman, and it's very useful!
If you liked Blindsight, another book that deals with similar problems from a different angle is Permanence, by Karl Schroeder. Excellent, and hard science fiction in the same vein. http://www.amazon.com/Perma...
Allow me to recommend George Ainslie's book Breakdown of Will, which is mostly a theory of akrasia but also contains some intriguing thoughts on the origin of the self from the need to coordinate different mental models with conflicting desires and non-economically-rational discount curves.
Roko, we now coordinate far more than is suggested by the phrase "just something that happens occasionally." I have a much better idea of what is likely to be preserved in the distant future than of how much different folks today value those things.
> Our genes encode selfish strategies for coordination
To put it another way: coordination is just something that happens occasionally as a result of optimizing behavior.
In the future, optimizing entities might coordinate a lot less or a lot more than we do now. The full range from the singleton to the burning of the cosmic commons has been aired as possible.
I would like to see a post on your detailed anticipations about how selfish optimizing behavior and different kinds of coordination and signaling mechanisms in the future will change us yet leave something axiologically important preserved. My impression is that you don't have a good idea of what of importance would remain, but I could be wrong.