Greg Egan is one of my favorite science fiction authors, and his latest novel Zendegi (Kindle version costs a penny) seems to refer to this blog:
“You can always reach me through my blog! Overpowering Falsehood dot com, the number one site for rational thinking about the future—”
That is Nate Caplan, a self-centered, arrogant, rich American male nerd who creepily stalks the novel's Iranian female scientist hero, Nasim Golestani, an expert in em (brain emulation) tech. Nate introduces himself this way:
I’m Nate Caplan. My IQ is one hundred and sixty. I’m in perfect physical and mental health. And I can pay you half a million dollars right now, any way you want it.
Nate wants to pay so he can be the first em:
It’s very important to me that I’m the first transcendent being in this stellar system. I can’t risk having to compete with another resource-hungry entity; I have personal plans that require at least one Jovian mass of computronium.
Nasim naturally despises Nate.
So is Nate Caplan inspired by me, by my famously libertarian colleague Bryan Caplan, or by Eliezer Yudkowsky, who was my co-blogger back when Egan wrote this book?
Consider that Egan’s book also contains a Benign Superintelligence Bootstrap Project, clearly modeled on Eliezer’s Singularity Institute:
Their aim is to build an artificial intelligence capable of such exquisite powers of self-analysis that it will design and construct its own successor. … The successor will then produce a still more proficient third version, and so on, leading to a cascade of exponentially increasing abilities. … Within weeks—perhaps within hours—a being of truly God-like powers will emerge.
This institute is backed by an arrogant, pompous "octogenarian oil billionaire," Zachary Churchland. To say more here, I'm going to have to give spoilers – you are warned.
Both Churchland and Caplan end up succumbing to illness and being cryonically frozen. No other characters even consider cryonics, apparently seeing it as distasteful, and revival as a very long way off, if possible at all. Everyone with any sense in the book treats the superintelligence folks as complete idiots, and after years of spending billions:
The sum total of their achievements had amounted to a nine-hundred-page wish-list dressed up as a taxonomy, a fantasy of convenient but implausible properties for a vast imaginary hierarchy of software daemons and deities.
While Nate Caplan’s “entire world view had been molded by tenth-rate science fiction,” he is at least somewhat reasonable. Nasim ends up choosing to team with Nate, who introduces her to the key tech she had been missing:
Side-loading is the process of training a neural network to mimic a particular organic brain, based on a rich set of non-intrusive scans of the brain in action.
Together they make side-loads that successfully emulate parts of human brains, such as an athlete’s sport skills, in order to populate virtual reality games with human-like players. Nasim starts to have second thoughts when she hears that Nate will try to use side-loads to displace half a billion human jobs. A league of powerful terrorist hackers then disrupts Nasim’s games, and issues demands:
It’s unethical to create conscious software that lacks the ability to take control of its own destiny. … That’s where we draw the line: no higher functions, no language, no social skills. … If you want to make something human, make it whole. If you want to enable people to step from their bodies into virtual immortality, perfectly copied, with all their abilities preserved and all their rights intact… go ahead and do it, we have no problem with that. … But if you want to put humanity into a cheese grater and slice off little slavelets to pimp to the factories and the VR games, well … then you’ve got a war on your hands.
Nasim isn’t convinced at first:
None of the side-loads can be conscious in the human sense. They have no notion of their own past or future, no long-term memory, no personal goals.
But at the very end of the novel Nasim is persuaded to join this hacker league to try to make ems illegal. What changes her mind? She tries to make a more complete side-load of Martin Seymour, who is dying of cancer, so the copy can guide Martin’s orphaned son. After extensive testing Nasim concludes:
Either Virtual Martin felt nothing, or he felt exactly what he claimed to feel: love for his son, acceptance of his limitations, and contentment with the purpose for which he’d been brought into existence.
The copy is quite intelligent and articulate – so far, so good. Then Martin tests his copy, named Jack, by pretending to be his son Javeed, inside a virtual reality game.
“You fucking worthless piece of shit!” He tore the metal helmet from his head and threw it on the ground; his face was contorted with anger and disgust. “Baba, it’s just a game,” Martin pleaded. “Are you my son?” Jack raged. “Is this what that fucker Omar did to you?” “Baba, I’m sorry—” Martin stood his ground as Jack walked up to him and started flailing impotently with his fists at Sohrab’s giant body. Jack sank to his knees. “Is that what I taught you? You couldn’t help yourself, even when he begged for his life?” He clawed at the dirt. “What am I, then? What am I doing here?” He struck his head with his fists, distraught. “Baba, no one’s hurt, it’s just a game,” Martin insisted. He shared Jack’s revulsion at what they’d both witnessed; he had known full well the feelings that his act would provoke. But he was sure he could have held his own response in check for Javeed’s sake; he could have stood back from his anger and found some gentler rebuke than this.
So because, in a situation of extreme emotional stress, one copy didn’t have the self-control that its original believed (perhaps falsely) that he would have in such a situation, all ems must be illegal. After all, this copy, horror, used swear words! Even though we have many millions of quite conscious and social animals enslaved to humans, it is an abomination for any such creature to have once been part of a human. No evidence is offered that any side-loads had suffered yet, or that any substantial fraction of future ones would suffer – the mere possibility that some of them might suffer at the hands of profit-making firms seems to be enough.
Egan presents reasonably rich characters, and has a reasonable grasp of the relevant technical issues, both regarding emulations and superintelligence. Egan presents a plausible scenario of scientific and business progress – while I think full emulations would mostly displace partial ones when available, partial ones could certainly retain a place.
However, Egan seems unreasonably idealistic when it comes to his imagined hacker league – they are organized implausibly early and well compared to the businesses they oppose. And Egan’s ethics seem an incoherent muddle to me, though alas that doesn’t make him worse than average. He seems to have a species purity obsession, disgusted by any making of non-human minds from human minds. And his presentation of pro-cryonics, pro-superintelligence, and especially pro-em characters seems a bit mean-spirited to me, no matter which of us inspired those characters.
I may be years late, but for the sake of posterity, I have to point out that TJIC is referring to a (false) theory very briefly considered by a character in a book by Greg Egan. I can hardly believe that he read about a conspiracy theory briefly considered by a character in a novel containing such a wide range of conflicting ideas, and immediately took this conspiracy theory to be a reflection of the actual beliefs of the author. That he immediately stopped reading out of distaste for the imagined personal beliefs of the author, and then posted here a complete misrepresentation of the plot of a book he did not read, is even more beyond belief. It appears, however, that this is exactly what he did.
I also find Hanson to be much smarter in the socially useful way. It may be that Eliezer was 'smarter' and figured out that he did not need to study, that he could just talk people into giving him money to 'work on AI'. But as of now, Eliezer has never studied anything. An autodidact is what you get when you take an ambitious person and strip them of the opportunity to pursue an education. People who drop out of their own will are not autodidacts; they just don't want to study.