My book’s topic seems to me so obviously important that I figure a reader’s main question must be whether he can trust me to actually know something about it. As a result, potential readers should be especially interested in criticisms: where do reviewers think my book gets it wrong? And as the book draws on many disciplines, readers should be especially interested in expert criticism, i.e., reviewers who find fault in an area they know well. Let us consider the reviews so far.
Most product markets today have a limited number of suppliers, and this would also be true in em labor markets. So the question is how many different kinds of workers there really are for the different kinds of jobs to be learned. I estimate that a thousand humans times a thousand ways to train and tweak them should be plenty.
Robin, you mentioned in a podcast with Stephen Cobb that, out of the human population at the time the em revolution occurs, the em economy will pick the 100 or so best humans (whatever that means) to be scanned, and make billions of em copies of them.
Don't you think 100 is too few, and that we might run into unforeseen problems from having too small a "base" for the em copies? And if you were in charge of the selection process, what characteristics would you be looking for?
I haven't yet read your book, so I don't know if the answer can be found there, but I think this is quite an important point if the whole economy is based on these 100 individuals.
I think Alexander's fear that the em world will inevitably become unconscious and soulless makes much more sense in light of a Yudkowskian view of intelligence. If intelligence is just this one simple algorithm, and the reason human minds use heuristics and have biases and so on isn't because those features are functional, but is just because evolution sucks so badly, then it seems entirely reasonable to expect that in a world where the brain's code can be edited, these characteristics would be edited out - in the medium term even if not immediately.
The only people I know who are obsessed with being in the flow-state to the exclusion of any other interests (experiencing novel things, being outside, maintaining relationships) are video gamers with impulse control problems. They're some of the most unhappy people you'll ever meet.
I am moderately confident that future reviews of your book, such as those to be written 50, 100, and 200 years from now, will be far more interesting and worth reading than any being written today.
Experiencing a single type of "highest fulfillment" constantly for all eternity is a form of wireheading. Opinions of that state's moral value differ.
Regarding "zero leisure," Alexander makes a normative argument, but it's unconvincing. The mainstream in psychology is that humans' highest fulfillment comes from a state of 'flow' - and that's also the most productive state for humans to be in. In expectation, then, if the em world manages to somehow undo the constraints of r&r on working, ems would be in a constant state of flow. This isn't terrifying; it's a Utopia.
I'm a bit surprised that the reviews don't complain about the tone of the book being too certain, almost arrogant, even though you yourself state that there are many uncertainties in predicting the future. I know that this is your style, and personally I have no problem with it, but it may be the reason why some reviewers struggle almost desperately to find factual errors in it.