44 Comments

I may be years late, but for the sake of posterity, I have to point out that TJIC is referring to a (false) theory very briefly considered by a character in a book by Greg Egan. I can hardly believe that he read about a conspiracy theory briefly considered by a character in a book containing such a wide range of conflicting ideas and immediately took that theory to be a reflection of the author's actual beliefs. That he then stopped reading out of distaste for the author's imagined personal beliefs and posted here a complete misrepresentation of the plot of a book he did not read is even harder to believe. It appears, however, that this is exactly what he did.


I also find Hanson to be much smarter in the socially useful way. It may be that Eliezer was 'smarter' and figured out that he did not need to study, that he could just talk people into giving him money to 'work on AI'. But as of now, Eliezer has never studied anything. An autodidact is what you get when you take an ambitious person and strip them of the opportunity to pursue an education. People who drop out by choice are not autodidacts; they just don't want to study.


Yeah, anti-free-market scifi authors are everywhere. Anyone have any suggestions for some recent scifi that isn't anti-free-market? I've been having trouble finding it.

What? Science fiction might have some anti-libertarian authors, but it is by far the most libertarian-leaning genre of literature. How many other genres have something like the Prometheus Awards? There are tons of libertarian sf writers and a good amount of sf featuring anarcho-capitalist and minarchist utopias.


Personally I like Michael Swanwick's future fiction. He posits superintelligent AIs that got really, really sick of being asked to route porn and cat images. They want to kill us, painfully. Most of them were shut down long ago, but there's always the danger that something whiling away the centuries in sleep mode will accidentally be reactivated.

When this happens, whatever you do, don't ask it a question. It is sick of our questions.


Robin, of course, is a top-class economist, physicist, etc., with important breakthroughs in the field of prediction markets.

Eliezer by contrast...ah... well, he did succeed in making a name for himself in the field of 'Harry Potter fan-fic'.


gwern, do you say this because Eliezer likes to talk about his IQ (which has not been tested by a reputable psychometric test at anything near his 99.99+ percentile claim), or because you think his IQ is higher than Robin Hanson's? Robin's work-product is blatantly superior to Eliezer's, so other than the whole "narcissists are generally more persuasive" angle, I'm not clear on why you would imply such a thing.


BTW - I'm about to begin trying to get publishers interested in my own novel, about novel ideas (ha ha) and transhuman possibilities; I intended from the beginning to challenge the accepted wisdom of the desirability, or lack thereof, of remaining a plain old-fashioned non-augmented human. I wonder if there will be difficulty here. Assuming my writing is of acceptable quality, will the departure from the norm be enticing, or off-putting? I guess we'll see. (I may end up just self-publishing for the Kindle. Those books tend to be awful, but it's a start.)


I don't know anything about this author either, and others appear to disagree with your assessment of him; but whether or not Egan is as you describe, I think this is a fairly prevalent problem in science fiction, and fiction in general.

Fantasy, for example, is full of "transhumanistic" ideas, though the obvious difference is that transcendence would be achieved by magic in this case; but I've yet to read a single fantasy novel wherein wanting to become more than human is considered a *good* thing. The Aesop, to borrow a term from TVTropes, is always that Immortality is Bad; Power is Bad; Godhood is Only Desired by Murderous Villains and the Life of a Simple Farmer is Always Enough for Any True Hero (or GODMVLSFAEATH for short).

It's very frustrating. "It allows someone to imagine radically different times and places, but wrap up the book feeling completely smug with all of the pre-conceived notions and beliefs that the reader started with." This quote sums it up very well; I was going to wonder "aloud" why the "Strange is Bad" moral appears so damn often, but I guess this explains it. "Hey, you're right in your comfortable old beliefs - and you don't have to worry yourself about the possibility of becoming something more, because you wouldn't want to anyway!"


Jamie_NYC, let me help you: http://www.google.com/searc...


If your transhumanist dream comes true (I hope it doesn't), in 1000 years no one will give a shit about what any human has achieved. In any case, no one will care more than we care about which chimpanzee managed to peel a banana using an oddly shaped rock.


You should have stopped at, "I don’t know anything about this".


I regret to say some SF fans still haven't given up the dream of a benevolent planned society under the benign guidance of SF fandom. (This explains the popularity of stories featuring the Second Foundation, Lensmen, or the Psychology Service.)


I was purely talking about the affordability of transhuman "upgrades", not about welfare for copies (who would contribute to the economy themselves anyway). "The ability to 'transcend' should not be determined by wealth" was the morally airtight stance. This is about the first part of this blog post, not about the copies in the second part.


Your argument is that it's immoral for wealth to be determined by being lucky in the birth lottery*.

His argument is that that argument becomes absurd in a world where one transhuman decides to create 4000 descendants and another creates only 2, and each then transfers the same amount of resources, divided equally among their descendants.

Since that would basically mean that you would have to think it would be moral to transfer the wealth from the undeservedly rich 2 descendants to the undeservedly poor 4000 descendants.

* It's also pretty annoying that you're assuming the validity of an ethical position that's far from universally accepted as a "morally airtight stance".


"Isn’t that just being smart?"

Nope, since for effectively everyone that means retreating to a tiny little niche nobody else cares about, in order to preserve your ego. It also leaves a ton of improvement on the table, both in accepting some risk of loss in return for greater pay-offs and in the character development that comes from not always being right or best.

Think about someone with a moderate talent in whatever area who then chooses to spend time only with others less talented in order to shine. That guy isn't going to develop anywhere near his maximum potential; he's going to have a severely inflated sense of self-worth and is going to be lacking all sorts of experiences that teach people that their opinions really aren't infallible.

I'd consider Charles Stross and David Brin pretty good examples of moderately bright guys, convinced of their own genius in all sorts of areas where they lack both education and experience.

"SF authors might just be more aware than others of one of Niven’s Laws"

SF guys seldom strike me as socially astute. And most of them really do use their characters as puppets for themselves and the strawmen they fight.

And no, the claim that flawed characters are a problem is refuted by virtually every other genre of fiction, where authors do just fine with more recognizably human characters. I really just doubt that they could write characters as well as authors of other genres.


Could you please rephrase that? I'm not sure what you are trying to say here.
