When Is “Soon”?
A few years ago, my co-blogger Eliezer Yudkowsky and I debated his singularity concept on this blog. We agreed that machine intelligence is coming and will matter lots, but Yudkowsky preferred (local) “foom” scenarios, such as a single apparently-harmless machine in a basement unexpectedly growing so powerful over a weekend that it takes over the world, its values drifting radically in the process. While Yudkowsky never precisely defined his class of scenarios, he was at least clear about this direction.