Imagine that your heart will give out soon, and you can’t get a heart transplant. An artificial heart is available, but someone tells you:
Artificial hearts are hell! The people who make them and the surgeons who put them in will want to be paid for their efforts. Artificial hearts use software, and software can have errors and require updates. And they may charge you for those updates. The artificial heart won’t be exactly like your old one, so the person who lives on with that heart after you won’t be exactly you. Furthermore, you can’t be absolutely sure that after the surgery they won’t secretly whisk you away to a foreign land where you will be enslaved with no rights, forced to clean toilets or fight wars, or tortured just for the fun of it. Why would anyone want an artificial heart?
This seems to me a crazy weak argument, so weak that making it seems obviously crazy. Yet Annalee Newitz offers exactly this argument against ems (= uploads):
The idea is that, one day, we will be able to convert all our memories and thoughts into hyper-advanced software programs. Once the human brain can run on a computer – or maybe even on a giant robot – we will evade death forever. Sounds cooler than a flying car, right? Wrong. If they ever exist, uploads will be hell. …
Boris’s brain can live forever inside some kind of virtual world like Minecraft, which looks and feels to him like reality. That means his entire universe is dependent on people or companies who run or manage servers, such as Amazon Web Services, to survive. Boris is going to be subjected to software updates that could alter his perceptions, and he might not be able to remember his favourite movie unless he pays a licensing fee. …
Somebody could duplicate Boris and make two armies of Borises fight each other for supremacy. Or, as Iain M. Banks suggested in his 2010 novel Surface Detail, a nasty political regime might create a virtual hell full of devils who torture Boris’s brain … He could be reprogrammed as a street cleaner, forced to mop Liverpool’s gutters for weeks without respite, …
Is it really a continuation of Boris the person or a completely different entity that has some of Boris’s ideas and memories? And what kind of rights does Boris’s uploaded brain have? He might become the property of whoever owns the server that runs him. … Technology decays and dies, so immortality isn’t guaranteed. So why would anyone want to be uploaded? (more)
Here this is published by what was once my favorite magazine. By an author who says she’s published in the NYT, which calls her new book “breathtakingly brilliant”. What is it about the future that makes people willing to say and accept such crazy things? This seems related to tech-related ingratitude, where people seem willing to call tech firms evil if there is ever any downside whatsoever to using their products. Which also seems pretty crazy.
Added 9am: Some correctly note that we may naturally be more concerned about errors in artificial brains than in artificial hearts. But the large and popular product categories of education, media, and mind-altering drugs are similarly ones where one should be more concerned about errors, because errors change our minds. Just because errors are possible and a concern doesn’t mean there can’t be a huge eager demand, nor does it turn the resulting scenario into “hell”.
I think there actually is a relevant difference between these two scenarios; it just doesn't have much to do with artificial brain vs. artificial heart. Rather, it's about who's in control: the owner or the manufacturer. This is the same sort of issue that has come up in a number of places nowadays, e.g. DRM vs. right-to-repair. If the artificial heart used DRM so that it could only be operated on by the company that made it, and had forced updates I couldn't turn off, yeah, I'd be pretty wary! In the heart case, the heart is inside you; ideally you can go to any surgeon for repair or modification, and maybe even make software modifications yourself. (But note that it's not enough that it's inside you; it also has to be the case that the manufacturer hasn't implanted software-based means of control into it that they can use to prevent you from doing these things.)
The em scenario differs from merely being an artificial brain in that it's an artificial brain controlled by someone else. Some of Newitz's complaints seem quite salvageable to me, and I'm not sure you've glossed them fairly; I've already done some of that salvaging above. Let me address the payment angle. A better way of framing this complaint, rather than "manufacturers want to be paid", is: imagine the manufacturer had the ability to inflict additional charges upon you that you didn't agree to in advance. If you control it -- you bought it, it's yours now, and they can't stop you from doing what you will with it -- you don't have to worry about that. If it's under their control... maybe they can, and likely they can get away with it.
Ah, have a link?