Discussion about this post

ZS:

Thanks, but I really can't agree with you, Robin. Many of my friends and I, along with intellectuals we follow such as Eliezer Yudkowsky, are actually people who very much welcome new technologies in general, yet we think that superhuman AI could quite likely spell the end of the human race, in the same way that humans have eliminated many less intelligent species. This is NOT a general fear of the future or of change; it is a rational analysis of the dangers of AGI, a challenge unlike any the human race has faced in the past. AGI might be more analogous to an invasion by super-intelligent grabby aliens, which I think you agree is a real danger.

Dean Valentine:

The additional wrinkle of "all humans die" seems important, but is not obviously highlighted as a difference between the two scenarios.

88 more comments...
