12 Comments

"I’ve said before that I don’t see how these imply a weeks timescale for one human level AI to make itself more powerful than the entire rest of the world put together."

Let's say artificial intelligence is only able to double its level of intelligence every 6 months (not even tremendously faster than Moore's Law). That still means that in 5 years it improves in intelligence by a factor of 1000. (!!!)

So in a little more than one term of a President, computers go from as smart as humans to 1000 times smarter. That's mind-boggling.
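
A quick check of that arithmetic (my own sketch): doubling every six months means ten doublings in five years, and 2^10 = 1024, i.e. roughly the factor of 1,000 above.

```python
# Sketch of the doubling arithmetic: 5 years of 6-month doubling periods.
doublings = 5 * 12 / 6      # 10 doublings
print(2 ** doublings)       # 1024.0, roughly a factor of 1,000
```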


The human-level AI is just one more human that's maybe working on AI, for a speedup factor of perhaps 0.00000001% (assuming a population of 10 billion at the time of AI). Actually, wait, even that is overoptimistic: human-level AI is a speedup of perhaps 0.00000001% after 20 years.
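
The fraction works out as stated (a quick sketch, using the assumed population of 10 billion):

```python
# One extra human-level researcher among ~10 billion people is a relative
# increase of 1/1e10, which is the 0.00000001% figure above.
population = 10e9
speedup = 1 / population
print(f"{speedup:.0e} = {speedup * 100:.8f}%")   # 1e-10 = 0.00000001%
```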

It is still possible that humans would improve the AI to a superhuman level relatively quickly, but I am dubious about that. A lot of the intelligent things we do are NP-complete or even EXPSPACE-hard. One can of course give the sci-fi response of 'heuristics', and that bit of technobabble would do to close a plot hole in a sci-fi story. Outside the context of making stories, though, for plenty of problems there are no good heuristics, or no heuristics substantially better than the known ones. People tend, for example, to imagine very deep prediction of chaotic systems by a superintelligent being. That's a task which requires knowledge, memory, and computing time exponential in the length of the prediction, and that is a fundamental property of the system you're trying to predict (sensitivity to initial conditions; see the Lyapunov exponent). On anything exponential, you need to be to mankind what mankind is to an amoeba merely to double your ability.
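
To make the sensitivity-to-initial-conditions point concrete, here is a minimal sketch of my own (not from the comment) using the logistic map, a textbook chaotic system: two trajectories that start 10^-12 apart become macroscopically different within a few dozen steps, so each extra step of reliable prediction demands exponentially more precision about the starting state.

```python
# Sensitivity to initial conditions (positive Lyapunov exponent) in the
# logistic map x -> 4x(1 - x): two nearly identical starting points diverge
# roughly exponentially until their separation is of order one.
def logistic(x):
    return 4.0 * x * (1.0 - x)

x, y = 0.3, 0.3 + 1e-12   # initial states differing by one part in 10^12
for step in range(1, 61):
    x, y = logistic(x), logistic(y)
    if step % 10 == 0:
        print(f"step {step:2d}: separation = {abs(x - y):.1e}")
```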


Maybe the book covers this, but I imagine this scenario:

1) There is a system that provides additional intelligence to people (imagine a chip in your brain connected to software that processes Google searches or sets of data that you pass to it, then gives you back intelligent answers to your queries).

2) One of these systems proves better than the others and so dominates the market. In at least some tasks it beats other models by a tiny fraction of a percent, and, due to the nature of those markets, people (and the system itself) can leverage that to force other systems out (a toy simulation of this dynamic appears at the end of this comment).

3) People with such a device outperform those without it in pretty much any area that they might care about (from dancing to dating to working).

4) It is impossible to tell if you use it unless you want people to know.

5) A mass-produced receiver is pretty cheap to make, there is almost zero marginal cost to add another user to the system, installation is a simple operation, and no maintenance is required within a lifetime.

6) Now let's say it proves advantageous for the system to be "intelligent".

Now there is no race between the AI and the rest of intelligence, as everyone is using the AI to do most of the heavy lifting anyway. With appropriate controls, this AI just continues to intelligently serve requests and engages in whatever private internal thoughts it cares to concern itself with.
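
Here is the toy simulation promised in point 2, with made-up numbers (a 0.5% quality edge and an assumed level of evaluation noise): because customers award a contract to whichever provider scores higher on their own evaluation, payoff depends on rank rather than on the size of the gap, and a tiny edge captures a lopsided share of the market.

```python
import random

# Toy winner-take-all sketch (assumed numbers): provider A is only 0.5% better
# than B, but each customer awards the contract to whichever provider scores
# higher on a slightly noisy evaluation, so A wins the large majority of them.
random.seed(0)
QUALITY_A, QUALITY_B = 1.005, 1.000   # A's edge: half a percent
EVAL_NOISE = 0.002                    # per-customer measurement noise (assumed)

contracts = 10_000
wins_a = 0
for _ in range(contracts):
    score_a = QUALITY_A + random.gauss(0, EVAL_NOISE)
    score_b = QUALITY_B + random.gauss(0, EVAL_NOISE)
    wins_a += score_a > score_b

print(f"A's quality edge: 0.5%; A's share of contracts: {wins_a / contracts:.1%}")
```

With these assumed numbers A ends up with well over 90% of the contracts; the point is only that rank-based markets turn tiny quality gaps into lopsided market shares.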


"However, the parallel revolution in manufacturing (3D printing/additive manufacturing and, later, some kind of nanotechnology) will make it possible for small groups to accomplish things that only governments and large corporations can do now. I call this the manufacturing revolution or singularity. It will tip the balance in favor of the individual and small groups."

What is the one thing that can be manufactured easily without expensive equipment?

Software.

Who actually makes most commercial software?

Large corporations.


Don't worry. They will just buy off the state to force the rest of us to cover their losses. Public risk, private profit. It's the bankster way.


New readers might not realize that Miller didn't just post here once but a decent number of times.


"But weapons development is a time- and resource-intensive task, making it extremely unlikely that the villain's small team of followers could out-innovate all of the weapons developers in the rest of the world by producing spectacularly destructive instruments that no other military force possessed."

Generally this is true, especially for nuclear weapons technology. However, there are two caveats that must be mentioned. One, large institutions such as governments and large corporations are bureaucracies, and it is common knowledge that bureaucracies have a hard time with innovation. Two, we talk a lot about the AI revolution, which is still mostly theoretical to me. However, the parallel revolution in manufacturing (3D printing/additive manufacturing and, later, some kind of nanotechnology) will make it possible for small groups to accomplish things that only governments and large corporations can do now. I call this the manufacturing revolution or singularity. It will tip the balance in favor of the individual and small groups. Indeed, Peter Thiel is adamant that this revolution is absolutely essential for the preservation and expansion of individual liberty. I completely agree with him on this point.

Governments and large corporations are dinosaurs, and rightly deserve extinction. However, they will not go quietly into the night. The pursuit of liberty, as always in the past, will require struggle.

I remain skeptical about the promise of A.I. We still know little about neurobiology, and even when we do understand it, modeling it in software on semiconductor-based computers will prove to be a very difficult feat.


Is the book's writing quality much higher than that of the author's articles available online? I looked at some of the articles linked from Miller's resume and wasn't impressed, to be honest. That includes some very dubious "maths" in the Politics of Immortality article in H+ magazine. Are there any preview chapters available online?


It seems to me that we already have ultrapowerful AIs, foremost among them the Googleplex.  If it lacks any particular motivations or desire to grow, that's only because it isn't clumsily welded into an ape.


The Singularity was already an old idea when Kurzweil's last book came out and when you and Yudkowsky had your debate. From your review, it doesn't seem that this book adds anything significant.


Normally, investors invest in order to help ensure they benefit more than non-investors from any resulting changes. They are unlikely to stop wanting this - and if others don't want their investments, then their loss probably won't be missed. 
