Nick Bostrom’s new tome … has a great cover with a number of interesting questions, and a subtitle that hints it might address the meaning of life in a future where AI and robots can do everything. But alas, after much build-up and anticipation, he leaves that question unanswered, with an abrupt "oops, out of time" on page 427. … He tries to address meaty topics: what keeps life interesting? What is our purpose and meaning when the struggle is gone? Can fulfillment get full? But in each case, the pedagogy is more a survey of all possible answers than the much more difficult task of making specific predictions.
At equilibrium, creatures in an ecosystem either contribute to that ecosystem or are considered parasites. What will humans contribute in this future utopia?
You can only see about 5,000 stars in the night sky with the naked eye. So "descendants as numerous as the stars in the sky" should be quite achievable, given a long enough time frame. We just need to avoid this birth-rate collapse thing.
"Similarly, a great many futurists try to imagine crazy advanced technology and social institutions in the context of cultural values quite close to their own"
Great observation. Flying cars are a great example of an old prediction that is now technically possible yet no closer to reality. Unless you are living in a post-scarcity world, it will always be hard to justify the added vehicle and fuel expense of going airborne to shave a few minutes off your commute. It is easier to imagine a future with superior urban planning, or one where we work remotely in VR. This phenomenon is relatively common if you go back through old futurist predictions: many are now technically feasible but have no place in our modern world, and a surprising number of them appear to be altogether useless.
>After all, Bostrom says little about honor, of enjoying the lamentations of those you conquer, or of having “descendants as numerous as the stars in the sky and as the sand on the seashore.”
I really think this is the key insight here. As foreign as these values seem to us moderns, they were evidently crucial to the survival of past societies (i.e., almost certainly not evolutionary "spandrels"). I haven't read Bostrom's book, but if he neglects this perspective, I suspect it is because he doesn't fully appreciate the importance of raw violent power in shaping the social and adaptive landscape. That's not too surprising, since Western states often endeavor to conceal that unsightly fact, though one would hope that world-class philosophers and scientists would not be fooled.
I feel like the assumption that creatures like you or me won't be in charge is itself projecting our own features onto AI. I am still not at all convinced that those actors would gather power to themselves, or act in their own interest against ours, unless they were designed to do so.
What does a future with superintelligent AI look like? It's a fascinating question, but I think almost by definition the answer of any single person will be biased and unreliable. I would much rather see a book with chapters written by twenty diverse people across different fields.
The original Foundation novels are like this. Interstellar travel is possible, but relations between men and women are pretty much the way they were in 1950. No woman holds any position of industrial or political power.
I'm not actually sure about this. Would the world today seem that foreign to a Roman from 2000 years ago?
I bet many of them would be disappointed by how little has changed.
I also think billions of years could pass in shocking stagnancy if some person or group managed to gain totalitarian control over the world, a threat we currently face in dire form, and one which may have already de facto happened.
Isaac Asimov's work is like this: robots alongside creaky 1940s political stances.
Frank Herbert, and later Brian Herbert, explore this topic in the Dune series.
Have both you and Nick read the entire series? Many don't take sci-fi writing seriously.