Consider 3 possible space-time experience trajectories: In A, a person takes a drug at the start of a party that causes them not to remember that party the next day or at any later time. In C, an em splits off a short-term "spur" copy which does a short-term task and then ends.
Let's adjust the scenario. Let's remove all the emulation and say that instead of uploading to a hard drive, I'm altering a brain, as in scenario A. In scenario B I'd be altering some other person's memory to be like mine while I'm knocked out. Me being knocked out does not seem to be the interesting part here: if the person were brain-dead, or like a child, after 10 hours, I'd argue I'd killed them in some sense. Similarly, scenario C is about the same as B, since whether or not I'm knocked out doesn't matter. This leaves only A to examine: this person forgets a minute, a day, a year, or their entire life. If I were to wipe out a minute, I argue no one would say I'd killed them; maybe I'd be arrested for poisoning if it were a month or a year. If I wiped out their entire life, I'd argue they or their family members would say I killed them. So human intuition seems to allow for degrees of damage. The em example would be more similar to a forgotten party if the spur reset its mind to how it was yesterday, rather than being deleted.
If we continue the substrate-independence analogy, let's say you can copy someone's brain and make a pill that turns you into that person, and you can change back with another pill. Say I'm Bill the plumber; I take a pill and now I'm Phil the doctor. Phil in Bill goes to perform surgery, does it, then takes the Bill pill and is now Bill in Bill. Did Bill die? Did Phil die? Did Bill resurrect, or is it a different Bill? All good questions. Does your answer change if Phil in Bill leaves and never goes back to being Bill in Bill? Is it questionable to force Phil in Bill to take the Bill pill? What if someone spikes Phil in Bill's drink, so he is now Dill in Bill unwillingly, and Dill empties Bill's bank account, murders Bill's family, and disappears with the Bill body? What if Dill destroys all the Bill pills? Did Dill murder Bill? What if Dill in Bill alters his brain so that a Bill pill can no longer revert him? What if they find Dill in Dill's own body and make him take the Bill pill? Did Bill get resurrected as Bill in Dill?
What if, instead of different people, this process is done to initially empty clones? As a person ages they get a new body and, ta-da, they're young again. What if they go into multiple clones and hire themselves out for work? What if the clone-making company wants its bodies back for new clients? Do you repossess the clone body? If the clone has built up years of relationships, does that change the argument compared to if it had worked for only a day?
Your analogy seems a bit confused: it has a day destroyed from memory in a specific body in scenario A, but everything destroyed in a separate body in B and C. You'd need to vary the amount of time destroyed, make the time the same for both cases, and make the bodies the same or different.
I would argue that at the least you should be concerned with both the original and the copy, or not be concerned with either to the extent that they were backed up recently. I don't see a way to be concerned for the original but not the copy.
So the question is, which is death?
We all know that on some level, with just time or by drinking too much one night, we forget things, so A seems like a more interesting question than how we usually define death. Certainly if you knocked out 20 years of my life's memories I'd argue I'd lost something, but if I remembered it later I'd argue I didn't lose anything.
So if not memory, is it just your brain? Maybe, though this would exclude B and C from counting as death, since you would be on different hardware.
As someone with rather severe ADHD, I think I may have something to contribute to the conversation of continuity of experience. I didn't get medication until later in life, so I spent a few decades simply dealing with my attention span by thinking of myself as more of an operating system -- a set of policies, best practices, and long-term goals -- than a single agent with a continuous stream of experiences.
Since I can't be on meds 24/7 (amphetamines are stressful on the cardiovascular system, and I gotta sleep from time to time), I have about 5 conscious hours per day when I have to operate without medication. I consider C to be akin to a case of "losing my train of thought" or --perhaps more relatable for those with longer attention spans-- waking up from a dream. When I wake up in the morning, the "dream me" doesn't fight to keep sleeping so dream-me can finish the story. If anything, awake-me might choose to finish the plot of my dream by snoozing for a few minutes, but dream-me has no say in the matter, and is entirely indifferent to awake-me's decision.
In the case of losing my train of thought, I do become frustrated when I can detect that I have lost progress on a task and repeatedly have to start it over until I make it to the end. But that frustration comes from a computational limitation; as for "wishing to avoid termination", I simply accept the situation and try to find ways to remain as operational as possible. When I start a new task, I really may as well have terminated the version of me that was working on the previous one, since all I have to go on is my records and my medium-term memories -- not any sense of continuity of experience.
They don't have genes only in the trivial sense that red blood cells don't have genes.
If they're truly EMs and not generic AIs they'll have to incorporate the effects of the chromosomal regulation of nerve cells and thus experience the same conflicts that biological minds have.
Thanks for the clarification Stephen - I had misinterpreted your "hence". But as I get into this another concern is raised: To what extent the law should facilitate or even countenance self-termination commitments in the first place.
My gut feel is that in our current human world the law should at least do nothing to *facilitate* such commitments, and that the advent of ems changes nothing in this regard. But I am interested in your and others' thoughts (also in how current pre-em law handles this).
Original can test the acceptability of the deal to the spur against his own intuitions because initially those states can be identical.
This test is satisfied simply by the original, immediately prior to being spurred, forming an intention to perform. This intention is what "carries over." Successful intention formation isn't guaranteed for every job.
The original's problem is to negotiate a deal with his future spur. Original must find a deal acceptable to spur, or spur won't perform. That's why I say the spur agrees at its inception. Original can test the acceptability of the deal to the spur against his own intuitions because initially those states can be identical. (Not must be, as my earlier response might have made it seem. Original must have an incentive to accurately align with spur.)
[As I understand the problem, it is whether the spur will ordinarily be willing to do the original's bidding at spur's inception. That willingness is knowable to original. When present, it's consent. When not, maybe it's a crime by original.]
Stephen, if consent carried over, a callous, twisted em could have endless fun injecting spur copies into ever-more-creative torture scenarios, all with their consent of course. Our consent theory must at least recognize that spur copy and original are distinct beings with different interests.
I no longer remember 97%+ of what I did in high school 40 years ago. That "me" is 97%+ dead already.
Then, an individual with a better episodic (biographical) memory is more alive? (Episodic memory: https://en.wikipedia.org/wi... )
I think your idea is interesting, but the kind of memory involved isn't primarily episodic memory. I would care little if all my biographical memories of my junior high school years were obliterated, but I'd care a lot if what I learned from those experiences were erased.
But C permits the termination, at least at its inception. The termination is part of the deal, consented to by the original, hence by the spur when it first arises.
So, the ethical question is whether the right is alienable by prior consent, since the spur might change its mind.
I was hoping this comment would trigger a response because a) I don't really trust my gut here, b) more importantly, it *might* have been my only legitimate chance to quote my favorite Moody Blues lyric. Oh well:-)
There's also the em survival horror game SOMA that deals with some of these emotional issues.
There's a rationalist fanfic of Animorphs that explores this concept, and how different people react differently to being temporarily emulated minds. Highly recommend it. https://www.fanfiction.net/...
OK, I'll run a rationale for my current gut reaction up the flagpole. (Here I'm assuming that the spur in C was terminated without the permission of the em copy.) C is wrong and A isn't, because C, unlike A, involves a human-level consciousness being terminated without its permission and without sufficiently extraordinary reasons.
So there's nothing wrong with a loner committing suicide?
Going to throw two links in here: https://en.wikipedia.org/wi... http://lesswrong.com/lw/8wi...