Yesterday I talked about one big change with ems (future whole brain emulation robots) – they’d mostly be workaholics. Another big change with ems, I think, is their concept of and attitude toward “death.” Ems would often agree to make copies of themselves, copies which they expected would only last for a limited time, such as one year. But they’d mostly be fine with this. Let me explain.
Imagine that you have lost all memories of some period of your life, say a period one year long. You still have pictures, letters, a diary, some video, memories held by others you can talk to, etc. But while it all sounds like the sort of thing you might have done, you don’t additionally recall doing any of it. How much would this memory loss degrade the value of your overall life? I’d say it would be far worse to have not lived that year at all, say by being put in suspended animation, than to merely have lost the memory of that year.
Now imagine that, because you had access to a time machine, this lost year happened at the same time as one of your other years. During 2006, for example, you were off experiencing 2005 all over again, but in another place, and then you forgot it all, except for the pictures, etc. For me, this would not much degrade the overall value of my life. It would again be a bit sad not to remember that year, but it matters little when that year happened.
Now imagine that you could use this time machine to both experience 2005 twice, forgetting one of the parts, and also to experience 2006 as usual. Here you’d be adding one more year onto your life, which I’d consider great. If the cost of having one more year of life were that you don’t fully remember that year, to me that would be a small price to pay.
For an em who shared my attitudes here, the option to spawn a new copy who only lasted a year would be much like the option to live another year longer, but without remembering it. Mostly a good deal, at least if you liked your life during that time. Yes the copy might be sad when his year came to an end, knowing his detailed memories of that year would not last. But he’d usually expect that “he” would continue to exist through other copies. He wouldn’t consider this harm to be remotely as large as what we call “death” — the end of anyone who remembers our life in some detail.
Ems would start as scans of humans, but not of random humans – the humans would be chosen for their productivity and their acceptance of the em patterns of life, and “death.” As a result, ems would mostly be fantastically capable workaholics who were not greatly bothered by “death,” given the existence of other close copies. Since they seem to me quite “human,” with lives well worth living, I consider the em revolution to be far more glorious than horrifying.
That depends on how you define "continuity". If the entity believes it is continuous with a human entity, who is anyone other than that entity to disagree?
Would you require that the entity be scanned and its thought processes traced and emulated so you can figure out how it arrives at its belief that it was formerly a particular human entity?
Is that the standard you use to decide if a human is who he/she says he/she is?
People can lose large fractions of their brain and still believe they are the same entity as before. Gabrielle Giffords lost a big chunk of her brain. Is she the same entity she was before? Is she still the person that was elected? She lost more than a “few lines of code”.
What kind of “proof” is necessary to establish entity continuity?
Whatever “self-identity module” a human has could be instantiated as a subroutine in an em and then called upon with a few lines of code when needed. There could be millions of self-identity modules. Whichever one is “active” gets backed by the hundreds of trillions of lines of code (or whatever) that a human needs to have self-identity.
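To make that “self-identity module” picture a bit more concrete, here is a minimal Python sketch, offered only as an illustration of the commenter’s idea; every name in it (EmMind, IdentityModule, claims_continuity_with, etc.) is hypothetical and not any real brain-emulation software. It shows a large shared mind holding many small identity modules, only one of which is active at a time, and continuity being nothing more than the active module’s own belief.

# A minimal, purely illustrative sketch of the "self-identity module" idea:
# the bulk of an em's mind is a huge shared codebase, while self-identity is
# a small swappable module that can be "called upon with a few lines of code."
# All names here are hypothetical, not any real brain-emulation API.

from __future__ import annotations
from dataclasses import dataclass, field


@dataclass
class IdentityModule:
    """A compact record of who this em believes itself to be."""
    name: str
    remembered_people: list[str] = field(default_factory=list)

    def claims_continuity_with(self, person: str) -> bool:
        # Continuity here is just the module's own belief, echoing the point
        # that only the entity itself can assert (or deny) it.
        return person == self.name or person in self.remembered_people


class EmMind:
    """Huge shared cognitive machinery plus one small 'active' identity module."""

    def __init__(self, shared_lines_of_code: int) -> None:
        self.shared_lines_of_code = shared_lines_of_code  # stands in for the common bulk
        self.identities: dict[str, IdentityModule] = {}
        self.active: IdentityModule | None = None

    def register(self, module: IdentityModule) -> None:
        self.identities[module.name] = module

    def activate(self, name: str) -> None:
        # Swapping identities touches only the small module; the shared
        # cognition behind it stays the same.
        self.active = self.identities[name]


if __name__ == "__main__":
    mind = EmMind(shared_lines_of_code=10 ** 14)  # "hundreds of trillions of lines"
    mind.register(IdentityModule("Robin-copy-2005", ["Robin"]))
    mind.register(IdentityModule("Robin-copy-2006", ["Robin"]))
    mind.activate("Robin-copy-2005")
    print(mind.active.claims_continuity_with("Robin"))  # True

Under this (purely hypothetical) framing, the continuity question in the comments above reduces to which small module is active and what it believes, not to tracing the whole emulated mind.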
Yes the copy might be sad when his year came to an end, knowing his detailed memories of that year would not last. But he’d usually expect that “he” would continue to exist through other copies. He wouldn’t consider this harm to be remotely as large as what we call “death” — the end of anyone who remembers our life in some detail.
I completely disagree. What if someone were to tell you tomorrow that you've outlived your usefulness and you'll be euthanized in a week? But don't fret, you're actually a clone, and somewhere out there is the original version of you that sold the license for this Robin to work as an economist until it was deemed you were no longer very good at it. You would take little comfort in the fact that another Robin lives on.
And how long is it before a person gains their own identity? You said one year. What about five, or ten? When do you stop being the same consciousness as your other copies and start being wholly unique? The brain isn't a static object, and I assume an emulated brain will change with its new experiences.