I think there actually is a relevant difference between these two scenarios; it just doesn't have very much to do with artificial brain vs. artificial heart. Rather, it's about who's in control here, the owner or the manufacturer. This is the same sort of issue that has come up in a number of places nowadays, e.g. DRM vs. right to repair. If the artificial heart used DRM so that it could only be operated on by the company that made it, and had forced updates I couldn't turn off, yeah, I'd be pretty wary! In the heart case, the heart is inside you; ideally you can go to any surgeon for repair or modification, and maybe even make software modifications yourself. (But note that it's not enough that it's inside you; it also has to be the case that the manufacturer hasn't implanted software-based means of control into it that they can use to prevent you from doing these things.)

The em scenario differs from merely being an artificial brain in that it's an artificial brain controlled by someone else. Some of Newitz's complaints seem quite salvageable to me, and I'm not sure you've glossed them fairly; I've already done some of that salvaging above. Let me address the payment angle. I think a better way of looking at this complaint is not "manufacturers want to be paid" but "imagine the manufacturer had the ability to inflict additional charges on you that you didn't agree to in advance." If you control it -- you bought it, it's yours now, and they can't stop you from doing what you will with it -- you don't have to worry about that. If it's under their control... maybe they can, and likely they can get away with it.

Ah, have a link?

She has a long history of batshit crazy anti-technological and anti-transhumanist rants.

No problem, we just won't invite her to our upload community.

But since the technology for reviving a frozen brain may not be available until long after death, "in advance" may be the only option for purchasing an artificial brain. You could say that the immortal corporation you've entrusted with your brain will do its fiduciary duty by selecting an optimal artificial brain once those become available, but I think the same agency problems exist.

In a hypothetical involving an artificial heart, death seems to be the main thing someone could threaten you with to extort you. I suppose it could be possible to non-fatally torture someone via their artificial heart, though I'm hardly an expert on that, but there seems to be far less scope for it than with a full simulation via an artificial brain.

Apart from the simple arithmetic that 2 copies = twice the chance? For one, torturing a copy is nigh undetectable, as opposed to a regular kidnapping, which will be quickly noticed once you go missing.

I think the social norms around surgical/medical ethics are pretty much settled at the moment... the legality of em torture and enslavement remains an open question. Not to mention that you'll have to assume it never becomes legal or socially acceptable to torture/enslave an em, since ems can be preserved indefinitely.

Perhaps a thousand years later, our descendants have diverged enough and no longer share our values.

At most that's an argument against buying artificial brains in advance, not against the basic concept.

Suicide is rarely an option for someone being tortured.

Newitz didn't make the argument, but it's relevant nonetheless. How often do people buy artificial hearts for their own use that long in advance? Even a long-lived corporation would expect newer and presumably cheaper models to be available as technology advanced over that timeline.

Are you saying suicide wouldn't be an option for someone being extorted via their artificial heart?

I don't see how copying and changed speeds substantially change the chances of slavery or torture. And the lack of social norms during early adoption applies to any new product, including artificial hearts.

ANYTHING bought via delayed purchase is more problematic. Buying an artificial heart 50 years in advance would also have more problems. There's no intrinsic connection between artificial brains and delayed purchase, and Newitz didn't give that argument.

In practice, suicide just isn't much of an option for most people actually tortured.

I can think of some relevant reasons why the scenarios are different. If you freeze your brain now, it may not be converted into an em for a long time, by which time you may lack close allies to monitor what happens and act in your interest. There's also the scenario of being subjected to torture without even the option of death as an out, which goes beyond the limit to which people controlling your artificial heart can extort you.

Being kidnapped post-surgery and tortured is a vastly less plausible scenario than an em being pirated and tortured. We have norms and legal systems protecting against the former; billions of people have undergone surgery without being kidnapped and tortured afterwards, so your chances are pretty good (though I'm sure at least one unfortunate soul has been).

I don't think it's an unfair criticism that an em would have a greatly increased chance of being tortured or enslaved, given that ems can be copied and run at accelerated speeds, and given the plausible lack of copy protection, legal rights, or social norms, especially during early adoption.

Of course, concluding that nobody would want digital immortality because of these risks is quite the leap.

My friend refers to this type of thinking as “horribilization”. It’s quite common.
