26 Comments

At equilibrium, creatures in an ecosystem either contribute to that ecosystem or are considered parasites. What will humans contribute in this future utopia?

What constitutes "contributing" to an ecosystem? Is every lifeform other than a photosynthesizing plant a parasite?

I would define it as producing something that is consumed by other species.

In all these utopias, the AI robots do all the producing, and the humans do all the consuming. Given that the robots in these scenarios are assumed to be far superior in intelligence to humans, this just seems hopelessly unsustainable to me.

Humans already consume other lifeforms in an ecosystem. We're an apex predator.

I mean, sure. I’m not all that convinced our ecosystem is at equilibrium, or that an apex predator is all that different in kind from a parasite, but your point is taken.

Either way, I don’t understand how in these utopian fantasies people imagine that we will continue to get served by these robots with superior intelligence (not to mention strength, speed, memory, duplicability, not needing to sleep) and retain control over them.

Why is it humans dominate horses when they are so much faster than us? Or cars & guns when they are more powerful than us?

Because we are smarter than they are.

You can only see about 5000 stars in the night sky. So "descendants as numerous as the stars in the sky" should be quite achievable, with a long enough time frame. We just need to avoid this birth rate collapse thing.

Depends how you count. Some points of light are galaxies.

"Similarly, a great many futurists try to imagine crazy advanced technology and social institutions in the context of cultural values quite close to their own"

Great observation. Flying cars are a good example of an old prediction that is now technically possible yet no closer to reality. Unless you are living in a post-scarcity world, it will always be hard to justify the added vehicle and fuel expense of going airborne to shave a few minutes off your commute. It is easier to imagine a future with superior urban planning, or one where we use VR to work remotely. This pattern is fairly common in old futurist predictions: many are now technically feasible but have no place in our modern world, and a surprising number of them appear to be altogether useless.

>After all, Bostrom says little about honor, of enjoying the lamentations of those you conquer, or of having “descendants as numerous as the stars in the sky and as the sand on the seashore.”

I really think this is the key insight here. As foreign as these values seem to us moderns, they were evidently crucial to the survival of past societies (i.e., almost certainly not evolutionary "spandrels"). I haven't read Bostrom's book, but if he neglects this perspective I suspect it is because he doesn't fully appreciate the importance of raw violent power in shaping the social/adaptive landscape. That's not too surprising, since Western states often endeavor to conceal that unsightly fact--although one would hope that world class philosophers and scientists would not be fooled.

I feel like the assumption that creatures like you or me won't be in charge is itself a projection of our own features onto AI. I am still not at all convinced that those actors would gather power to themselves, or act in their own interests where they conflict with ours, unless they were designed to do so.

It is conceivable to me that some future authoritarian society might see it as necessary for their survival to create a powerful army of autonomous machine tyrants, in whose shadow they can at least plausibly persist as slaves.

Of course that assumes that the marginal utility of carbon-based intelligence won't collapse to zero (or, more specifically, that future authoritarians have reason to believe as much). I have my doubts that Bostrom's idea of "superintelligence verging on omniscience/omnipotence" is physically realizable. If, hypothetically, artificial intelligence eventually does plateau for centuries at roughly the scale of humanity, that may very well lend authoritarians the confidence needed to construct and unleash such an army (as an alternative to being eradicated by a hostile adversary).

Yes, I think that suggestion ignores fundamental limits on computation. The issue is that most problems tend either to have naturally low computational complexity (e.g., you can predict things without the cost growing too quickly asymptotically) or to be like the weather (computational costs rapidly become unsustainable), and AI isn't magic. It can't beat fundamental limits on computational complexity.
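To make the weather-like case concrete, here is a minimal sketch (my own illustration, not something from the thread), using the logistic map as a stand-in for a chaotic system: an initial error of one part in a trillion swamps the "forecast" after a few dozen steps, so extending the horizon by a fixed amount requires a constant-factor improvement in initial precision, and the usable horizon grows only logarithmically with the compute and measurement accuracy you throw at it.

```python
# Toy illustration of chaotic error growth (logistic map at r = 4, a standard
# chaotic system). This is only an analogy for weather-like problems, not a
# weather model.

def logistic(x, r=4.0):
    """One iteration of the logistic map x -> r * x * (1 - x)."""
    return r * x * (1.0 - x)

x_true = 0.3
x_approx = 0.3 + 1e-12  # initial condition known to one part in 10^12

for step in range(1, 61):
    x_true = logistic(x_true)
    x_approx = logistic(x_approx)
    if step % 10 == 0:
        print(f"step {step:2d}: error = {abs(x_true - x_approx):.3e}")

# The error roughly doubles each step, so it climbs from ~1e-12 to order 1
# within about 40 iterations, after which the approximate trajectory tells
# you nothing about the true one.
```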

What does a future with superintelligent AI look like? It's a fascinating question, but I think that, almost by definition, any single person's answer will be biased and unreliable. I would much rather see a book with chapters written by 20 people from a diverse range of fields.

Isaac Asimov's work gives us robots with creaky 1940s political stances.

The original Foundation novels are like this. Interstellar travel is possible, but relations between men and women are pretty much the way they were in 1950. No woman holds any position of industrial or political power.

And furthermore, in the earliest Foundation stories (written during WWII) the mathematical calculations for interstellar travel are done by hand, by spaceship pilots selected for their mathematical abilities. This was months before ENIAC, surely one of the most egregious examples of poor forecasting in all of science fiction.

Frank Herbert, and later Brian Herbert, explore this topic in the Dune series.

Have both you and Nick read the entire series? Many don't take sci-fi writing seriously.

The distant world of Dune features aristocratic houses fighting each other with knives, and no computing machines. Herbert deliberately made it resemble the past, shifting it away from sci-fi and closer to fantasy (hence its influence on Star Wars).

I'm not actually sure about this. Would the world today seem that foreign to a Roman from 2000 years ago?

I bet many of them would be disappointed by how little has changed.

I also think billions of years could pass in shocking stagnancy if some person or group manages to gain totalitarian control over the world, which we are currently in dire danger of, and which may have already de facto happened.

Imagine what Greek philosophers would think of the anti-intellectual leadership of this planet!
