That depends on how you define "continuity". If the entity believes it is continuous with a human entity, who is anyone other than that entity to disagree?

Would you require that the entity be scanned and its thought processes traced and emulated so you can figure out how it arrives at its belief that it was formerly a particular human entity?

Is that the standard you use to decide if a human is who he/she says he/she is?

People can lose large fractions of their brain and still believe they are the same entity as before. Gabrielle Giffords lost a big chunk of her brain. Is she the same entity she was before? Is she still the person that was elected? She lost more than a “few lines of code”.

What kind of “proof” is necessary to establish entity continuity?

Whatever “self-identity module” a human has could be instantiated as a subroutine in an em and then called upon with a few lines of code when needed. There could be millions of self-identity modules. Whichever one is “active” has the hundreds of trillions of lines of code (or whatever) that a human needs to have self-identity activated.
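
A minimal sketch of that idea, purely as illustration (the class names and structure below are my own assumptions, not a claim about how a real em would be built):

```python
# Hypothetical: self-identity as a tiny subroutine selecting one of
# millions of stored life histories. Every name here is illustrative.

class LifeHistory:
    """The bulky part: memories, skills, dispositions."""
    def __init__(self, name, memories):
        self.name = name
        self.memories = memories  # the "hundreds of trillions of lines"

class SelfIdentityModule:
    """The cheap part: a few lines that point at one history."""
    def __init__(self, history):
        self.history = history

    def who_am_i(self):
        return self.history.name

# Millions of such modules could coexist; whichever is "active" wins.
histories = {
    "human_a": LifeHistory("human_a", ["childhood", "career"]),
    "human_b": LifeHistory("human_b", ["a different childhood"]),
}
active = SelfIdentityModule(histories["human_a"])
print(active.who_am_i())  # -> human_a
```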

Yes the copy might be sad when his year came to an end, knowing his detailed memories of that year would not last. But he’d usually expect that “he” would continue to exist through other copies. He wouldn’t consider this harm to be remotely as large as what we call “death” — the end of anyone who remembers our life in some detail.

I completely disagree. What if someone were to tell you tomorrow that you've outlived your usefulness and will be euthanized in a week? But don't fret: you're actually a clone, and somewhere out there is the original version of you, who sold the license for this Robin to work as an economist until it was deemed you were no longer very good at it. You would take little comfort in the fact that another Robin lives on.

And how long is it before a person gains their own identity? You said one year. What about five, or ten? When do you stop being the same consciousness as your other copies and start being wholly unique? The brain isn't a static object, and I assume the emulated brain will change with its new experiences.

Human brains (or anything based on one) do not have any bit that runs on 'a few lines of code'. The code does not dictate the thinking process; the code dictates the behavior of individual elements, and their interactions and relationships dictate the thoughts and behavior. You can POSSIBLY postulate this for an entirely artificial process, but not for one that has any sort of continuity with a human.

Clifford D. Simak beat David Brin to it in "Good night, Mr. James."

I could imagine an entity with the life-histories of millions of individuals in its memory. The self-identity module could then be a line of code that specifies belief in self-identity with a particular one of those millions of life histories. When those few lines of code cause the entity to self-identify as “you”, does the value of the entity become infinitely high (to your present self), and when the few lines are changed to direct the self-identity to someone else, does it not?

The entity could be “wiped” simply by changing the few lines of code so that it self-identifies as a different one of the millions of entities in its life-history memories. The memories don't need to be deleted; they can still be there. As long as the self-identity code tells the entity that it is someone else, it is that someone else.

Does the entity get upset when it switches self-identity? Not unless there is code that makes it experience angst when its identity switches. Why would the designer of the EM waste code and computational resources on something so useless as angst about unimportant internal code changes?
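
A toy illustration of that “wipe” (again, an assumption-laden sketch of my own, not a description of any real em design):

```python
# Hypothetical: "wiping" an em by repointing its self-identity while
# all stored memories stay intact. Every name here is made up.

memories = {name: name + "'s full life history" for name in ("a", "b", "c")}
current_self = "a"  # the "few lines of code" doing the self-identifying

def wipe(new_self):
    """Switch self-identity; delete nothing, call no angst routine."""
    global current_self
    current_self = new_self

wipe("b")
print(current_self)   # the entity now reports being "b"
print(memories["a"])  # "a"'s memories are still sitting right there
```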

What would you do if you were decanted knowing you would be shut off in a year? Would you dutifully perform your appointed task, or would you spend the time trying to escape, and/or displace your parent copy?

I suspect that while we may try to scan people willing to accept (your view of) the em lifecycle, the ems that actually survive and reproduce will be the ones that reject it.

I am not conflating the two senses of trivial.

You are confusing your feelings that something is important with the thing actually being important. Upthread Hedonic Trader mentions people putting the value of material objects and ideas ahead of their own lives. They are also confusing their feelings that something is valuable with the thing actually being valuable.

You don't have a good definition of what it is that you are considering to be valuable. If you define yourself as the “self-identifying entity” that inhabits a particular piece of meat, then when that piece of meat is destroyed, so is the self-identifying entity that formerly inhabited it. If you define yourself as the “self-identifying entity that believes it is you and that believes it formerly inhabited a particular piece of meat”, then what matters is the belief state of the self-identifying entity. There could be millions of self-identifying entities that believe themselves to be you, electronic and biological.

In an electronic entity, the change that makes it believe it is you could be trivial, a few lines of code.

Well, admittedly my scenario is implausible, but it's a delicious plot for a story. If the bots REALLY considered themselves to be the same person as the original copy, it would be, in some sense, the original copy's JOB to live as well as possible in order to motivate his clones.

Should I be more willing to take risks, because even if I am killed in one universe, another me lives on in another? Should I not be sad if an accident kills a loved one because there’s another universe where they live?

These "should" questions make far less sense in many-worlds than is usually assumed. After all, you'll always get all possible answers to them, and you'll always make all possible decisions in some universes.

But let's assume there's exactly one duplicate universe, and your loved ones die in this one. I assume that the reason why you're going to be sad is that you have an emotional bond to your loved ones, and now you're going to live on missing them - for a while, at least. So of course you will be sad, because you are alive in a world without them, while still feeling the severed emotional bond.

On the other hand, consider this: assume there is a technology that scans your body and brain (and those of your loved ones) every evening and stores the contents to backup storage. There is a complementary technology that can re-create a person's biological body and brain from the backup data in case of an emergency. Your loved ones die in a car accident and are immediately re-created from backup. Will you stop having relationships with them? Or imagine the same thing happens to you - will you accept that your bank account no longer belongs to you?

If identical twins are two different entities, then a single human is multiple entities over their lifetime.

The fact that identical twins are two different entities is fairly self-evident. I don't know about the counterintuitive conclusion you draw from that, but I would say that if my future self time-traveled here, and it was possible for me to die without creating a time paradox, I still wouldn't want to.

People with even very severe traumatic brain damage don’t lose their sense of self-identity even though their brains have been massively reconfigured. Self-identity must be a pretty trivial brain function.

You're conflating "trivial" as in "easy" with trivial as in "unimportant." Let me assure you that even if self-identity is a trivial brain function, that doesn't mean it's not important. Having a heartbeat is likely even more trivial, but it is also of utmost importance.

Let's assume that it turns out the Many Worlds interpretation of quantum mechanics is right. Should I be more willing to take risks, because even if I am killed in one universe, another me lives on in another? Should I not be sad if an accident kills a loved one because there's another universe where they live?

It would be really, really, convenient if every individual em copy didn't become a distinct person with rights every time a new copy was made. But the universe isn't always convenient.

If identical twins are two different entities, then a single human is multiple entities over their lifetime.

The self-aware module doesn't have to be very sophisticated. Just a subroutine that returns “I am me” whenever questions about identity occur. Define the truth value of that statement to be TRUE and the entity will never doubt it.

People with even very severe traumatic brain damage don't lose their sense of self-identity even though their brains have been massively reconfigured. Self-identity must be a pretty trivial brain function.
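
As a throwaway sketch of how cheap such a subroutine could be (entirely hypothetical, of course):

```python
# Hypothetical: the whole "self-aware module" as one trivial subroutine
# whose answer is defined to be TRUE, so doubt never arises.

def i_am_me():
    """Called whenever a question about identity occurs."""
    return True

if i_am_me():
    print("Identity confirmed.")  # always taken; the doubt branch is unreachable
```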

An em dies when the copies can’t be ‘merged’ with the original. Consider multiple copies of ems to be analogous to possible future and past selves.

This sounds about right to me. Otherwise you could make Robin's analogy with identical twins. They start out as one person; then a copy is made. I think most identical twins want to live. The only difference I can see between ems and twins that might be important is that ems share memories, not just genes. I'd have to say that if I were an em, I'd probably try to avoid being deleted.

I think you overestimate how 'luxuriously' the HEAD repository/em would live.

If each em is a complete copy, then any of them could set up shop as the HEAD em, just like you can fork a software project and take over if you offer something new.

An em who tried to abuse its position as merge master to extract more than the cost of forking and copying out new ems and merging everything together would be undercut by one of its own copies (since the copies know how to do it just as well as whatever copy/original is running the merging).

And if you have enough control to prevent any competition - add brainwashing or cripple the copies in some way - well, then they are your slaves and you might as well just use that control to take a percentage of their earnings and let them continue on.
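
A back-of-the-envelope way to see the undercutting argument (toy numbers, all invented for illustration):

```python
# Toy model: since any complete copy can run the merge itself, fees get
# competed down toward the actual cost of forking, copying, and merging.

COST = 10  # hypothetical resource cost of fork + copy + merge

def undercut(incumbent_fee):
    """A defecting copy charges a bit less, but never below cost."""
    return max(COST, incumbent_fee - 1)

fee = 100  # the HEAD em tries to charge a 10x markup
while fee > COST:
    fee = undercut(fee)  # each round, some copy undercuts the incumbent

print(fee)  # -> 10: no margin left for a "luxurious" HEAD em
```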

Reading The Singularity is Near, I've spent a bit of time really contemplating the implications of Mind Uploading.

Assuming ems were truly "aware", I do not agree at all that an em copy would be so sanguine about its own termination. I'm not convinced that machine intelligence will ever create awareness, but assuming it did, and assuming I gradually replaced 100% of my biology with technology, even knowing that a "copy" of me was running somewhere else, I would still be aware of "me" as a distinct copy, and I wouldn't want that copy to cease existing.

Awareness is tricky to prove, and I'm not sure I would accept a computer program was aware just because it could present a persuasive argument that it was. It could just as easily be programmed to present a persuasive argument that it wasn't...

I'm no Luddite, I just expect that, as our models of the different regions of the human brain get better and better, we'll discover more and more how little we truly understand awareness. As they say, the more you know, the more you know you don't know.

David Brin beat you to it. Kiln People

Since there is no actual "individual" self or "me" but only a cultural / emotional / conceptual nexus around some words and ideas (which exists for obvious Darwinian reasons), Robin's point of view is just as valid as all the others expressed here.

What "you" actually are was never born and can never die. There are not multiple "selves", just one Cosmos experiencing through an uncountable number of dreams of selfhood. This can be perceived directly after deep contemplation (ala Einstein), meditation, certain drugs, or sometimes just a blind, random flash of insight. Your "me" / "I" is just a wetware program module. Or, as Alice said it: "you're nothing but a pack of cards!"

As for "Em" software or Yudkowskian AI, I'm truly not holding my breath for either one. . .
