We think about tech differently when we imagine it beforehand versus when we've personally seen it deployed. Obviously we have more data afterward, but this isn't the only or even the main difference. Having more data puts us into more of a near, relative to far, mental mode.
In a destructive scan, the original body is destroyed in the process of creating the scan.
If you are told that people who walk through a door enter a higher plane, but you never hear from them and have no evidence that they have any effect on anything, you are right to be skeptical. If you can talk to them on Skype all the time, their actions have clear impacts on things you care about, and in both ways they seem like the people you knew, you'd be a lot less skeptical.
What is destructive scanning?
> ...a significant number of humans would accept destructive scanning to become ems.

A link or parenthetical explanation would be helpful. If destructive scanning means physical death, then we are back to the 'uploading consciousness' trope of science fiction and transhumanism.
My hesitancy about transporters is based on a scene from one of the original Star Trek movies. There was a malfunction, and reassembly at the destination was botched, because the people arrived "turned inside out", as Scotty described it. I acknowledge that that is a safety concern rather than a philosophical one (of whether one's identity/essence is preserved).
Returning to destructive scanning: The introduction, about contrarian perception of futurist tech, stirred memories of another movie, Logan's Run. Citizens of that seeming utopia were led to believe that at age 35, they were transformed to a higher plane of existence through a fiery public ceremony. In fact, they were just killed for pragmatic reasons, e.g. population control. That's another reason to be skeptical of destructive scanning. Even if the tech worked, could those who were in charge of the implementation be trusted?
Unless the optimal algorithm for a specific task is human, I'd expect ems to lose any humanlike traits with time. Not by themselves, of course, but by being modified or replaced with better algorithms. I'd expect human brains to be awful by the "job done per processor instruction" metric, as they're not at all optimized for that by evolution. Modelling the whole brain instead of a much simpler "do-the-work subroutine" is a huge waste of resources. I'd imagine a developed em corporation as a simple "contract-drafter" AI, a simple "tester" AI instead of introspection, and so on. The AIs may be modified ems, or be written by the first ems. Maybe if we add high-speed mind-to-mind communication to the equation we still get something like a sentient corporation even though none of its workers is sentient, but that would just make it a nonhuman AI with all of its problems.
Experiments suggest that thinking generally doesn't cease during sleep; sleep seems like a blank stretch of time because memories aren't laid down.
General anesthesia might be a better point of reference.
Brain parts dedicated to sound and sight processing might be greatly shrunk. Emotions and introspection seem to be useful in general for social creatures - why assume the future has no use for them?
Only when such a theory was available and confirmed by a majority of the scientific community would I be prepared to step into the transporter....
So, even if it's been done countless times and the psychological sameness of the copy and original have been substantiated from every empirical angle, you would refuse to enter the transporter unless you could theorize your identity? Because your copy might not really be you? You fear you have something to lose, although you don't know what it is?
Choice and familiarity for starters. Also the swapping out of every atom in your body.
I'm still skeptical about the feasibility of the "human-like ems under high restrictions" scenario. Ems are not at all the optimal algorithm for doing their jobs. A large part of the human brain is devoted to controlling the body and processing a lot of signals from it. Most of these signals, along with the whole body, are dead weight if you care about algorithmic efficiency. The same goes for many human emotions and for introspection. So either ems are rapidly displaced by designed-from-scratch algorithms or by modificants (~ems with the unnecessary parts cut out), resulting in a nonhuman algorithm swarm, or they're not under strict resource limitations and can use the resources to model forests and penthouses and such, which needs only a tiny fraction of the resources required to run an em.
If they imagine they are about to enter a transporter, only half of them see their identity as preserved. But if they imagine that they have just exited a transporter, almost all see their identity as preserved. Exiting evokes a nearer mental mode than entering, just as history evokes a nearer mode than the future.
I haven't seen this asymmetry between the recent past and recent future noted in the CLT literature. (A priori, they are equally distant.)
The Walker link is broken. [Seems to me the post-transport certainty might be implied by the question: if it's indeed you, then identity has been maintained.]
It's more effective to ask for forgiveness than permission.
OK, I crossed out the part about breathing problems in trains.
> Opponents of the first steam trains argued that train smoke, noise, and speeds would extract passenger organs, prevent passenger breathing, disturb and discolor nearby animals, blight nearby crops, weaken moral standards, weaken community ties, and confuse class distinctions.
The uterus thing is a pretty reasonable fear; I mean, my neighbor's dog died from bouncing around on a boat and getting his intestines/stomach flipped (bloat/twisted stomach/gastric torsion is apparently a surprisingly common killer of dogs), and early railroads were far from a smooth ride. You also can't deny that trains do blight nearby crops through pollution and damage (anyone remember brake sparks and Coase?), which is why they need larger rights-of-way and part of why railroads go hand in hand with large governments, eminent domain, and land grants.

As far as 'prevent passenger breathing' goes - that is actually a myth, and specifically, it is a myth attributed to Dionysius Lardner (the link doesn't name him, but I can tell because it includes the '20 mph' telltale bit); considerable effort by myself and others has not turned up any such quote before the 1980s, but has turned up evidence that it was confused with genuine and reasonable concerns of Lardner (for example, about lack of ventilation in railroad tunnels, which, like in mines, has killed many people), see https://en.wikipedia.org/wi... https://en.wikiquote.org/wi... https://en.wikiquote.org/wi... So a classic urban legend I am a little surprised to see repeated in _History Today_ (incidentally, 'bicycle face' is another urban legend which fits the 'absurd Victorian fears of new technology' template).

And of course, trains certainly did break up communities and assist the transition to modern laxer forager moral standards; we think that was a good thing and much of the point, but fears of that were not wrong, any more than aboriginals would be wrong in fearing that the introduction of roads, cars, and modern agriculture would destroy their traditional way of life. They're wrong on values, not facts.
It would certainly be satisfying (and perhaps reassuring) to have a robust scientific theory of consciousness.
But we don't so far.
So how do you know that when you go to sleep at night, it's the same *you* that wakes in the morning?
I think you don't. Yet I suspect you sleep every night without much worry about it.
How is that different from stepping into the transporter?