“It’s classified. I could tell you, but then I’d have to kill you.” – Top Gun, 1986
Today, secrets are lumpy. You might know some info that would help you persuade someone of something, but reasonably fear that if you told them, they’d tell others, change their opinion on something else, or perhaps just get discouraged. So today you can’t just tell them one implication of your secret. In the future, however, the ability to copy and erase minds (as in an em scenario) might make secrets much less lumpy – you could tell someone just one implication of a secret.
For example, suppose you wanted to convince an associate not to go to a certain party. Your reason is that one of their exes will attend. But if you told them that directly, they would then know that this ex is in town, is friendly with the party host, etc. You might just tell them to trust you, but what if they don’t?
Imagine you could just say to your associate “I could tell you why you shouldn’t go to the party, but then I’d have to kill you,” and they could reply “Prove it.” Both of your minds would then be copied and placed together into an isolated “box,” perhaps with access to some public or other info sources. Inside the box the copy of you would explain your reasons to the copy of them. When the conversation was done, the entire box would be erased, and the original two of you would just hear a single bit answer, “yes” or “no,” chosen by the copy of your associate.
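To make the shape of this protocol concrete, here is a minimal sketch in code. Everything in it is hypothetical: `Mind`, `run_box`, and the toy conversation just stand in for whatever em-copying machinery would actually do the work.

```python
from dataclasses import dataclass

@dataclass
class Mind:
    """Stand-in for an em's full mental state."""
    name: str
    beliefs: dict

    def copy(self) -> "Mind":
        # An independent copy of the mental state.
        return Mind(self.name, dict(self.beliefs))

def run_box(discloser: Mind, listener: Mind, converse) -> bool:
    """Copy both minds into an isolated box, let the copies talk,
    erase the box, and return only the single bit chosen by the
    listener's copy."""
    d_copy, l_copy = discloser.copy(), listener.copy()
    verdict = converse(d_copy, l_copy)  # copies talk freely inside
    del d_copy, l_copy                  # the whole box is erased
    return bool(verdict)                # exactly one bit escapes

# Hypothetical conversation: the discloser's copy reveals the secret,
# and the listener's copy decides whether to skip the party.
def party_conversation(d: Mind, l: Mind) -> bool:
    l.beliefs["ex_attending"] = d.beliefs["ex_attending"]
    return l.beliefs["ex_attending"]

alice = Mind("discloser", {"ex_attending": True})
bob = Mind("listener", {})
print(run_box(alice, bob, party_conversation))  # True: "don't go"
print(bob.beliefs)  # {} -- the original learned only the bit
```

Note that the originals are never mutated; only the copies ever hold the secret, and they are gone before the bit comes out.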
Now, as usual, there are some complications. For example, the fact that you suggested using the box, as opposed to just revealing your secrets, could be a useful clue to them, as could the fact that you were willing to spend resources to use the box. If you requested access to unusual sources while in the box, that might give further clues.
If you let the box return more detail about their degree of confidence in their conclusion, or about how long the conversation took, your associate might use some of those extra bits to encode more of your secrets. And if the info sources accessed by those in the box used simple caching, outsiders might see which sources were easier to access afterward, and use that to infer which sources had been accessed from inside the box, which might encode more relevant info. So you’d probably want to be careful to run the box for a standard time period, with unobservable access to standard wide sources, and to return only a one-bit conclusion.
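A sketch of the hardening this suggests, again with made-up names: pad the box’s run to a fixed standard duration and force its output down to exactly one bit, so that neither timing nor confidence can carry extra information. (The caching leak would have to be fixed on the info-source side, and isn’t modeled here.)

```python
import time

STANDARD_SECONDS = 60.0  # fixed, publicly known run length

def run_box_padded(box_fn) -> bool:
    """Run a boxed conversation so that neither its duration nor the
    richness of its answer can leak extra bits: pad to a standard
    wall-clock time and truncate the result to one bit."""
    start = time.monotonic()
    try:
        result = bool(box_fn())  # any richer answer collapses to one bit
    except Exception:
        result = False           # even a failed run looks like a plain "no"
    remaining = STANDARD_SECONDS - (time.monotonic() - start)
    if remaining > 0:
        time.sleep(remaining)    # duration is identical from outside
    return result
```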
Inside the box, you might just reveal that you had committed in some way to hurt your associate if they didn’t return the answer you wanted. To avoid this problem, it might be usual practice to have an independent (and hard to hurt) judge also join you in the box, with the power to make the box return “void” if they suspected such threats were being made. To reduce the cost of using these boxes, you might have prediction markets on what such boxes would return if made, but only actually make them a small percentage of the time.
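Continuing the sketch, a hypothetical three-valued verdict lets an independent judge void the result when threats are suspected:

```python
from enum import Enum

class Verdict(Enum):
    YES = "yes"
    NO = "no"
    VOID = "void"  # the judge suspected threats

def run_box_with_judge(converse, judge_suspects_threats) -> Verdict:
    """A boxed conversation watched by an independent judge who can
    void the result if coercion seems to be in play."""
    transcript = []
    answer = converse(transcript)  # copies talk; everything is logged
    if judge_suspects_threats(transcript):
        return Verdict.VOID        # threats detected: no bit escapes
    return Verdict.YES if answer else Verdict.NO
```

Prediction markets could then trade on which `Verdict` a given box would return, with only a small fraction of boxes actually run to settle the bets.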
There may be further complications I haven’t thought of, but at the moment I’m more interested in how this ability might be used. In the world around you, who would be tempted to prove what this way?
For example, would you prove to work associates that your proposed compromise is politically sound, without revealing your private political info about who would support or oppose it? Prove to investigators that you do not hold stolen items by letting them look through your private stores? Prove to a date you’ve been thinking lots about them, by letting them watch a video of your recent activities? Prove to a jury of voters that you really just want to help them, by letting them watch all of your activities for the last few months? What else?
In general, this would seem to better enable self-deception. You could actually not know things anywhere in your head, but still act on them when they mattered a lot.
They are most likely willing to erase the box because they consider both copies the same person, and don't think killing one is murder. They'd consider it more like a memory wipe.
I'm reminded of a scene in HP:MoR, where a character noticed some similarities between memory wipes and murder. That may be a reference to this. I know Eliezer disagrees with Robin on this point.
> No, they aren't. The second law of thermodynamics forbids it.
The second law of thermodynamics doesn't work that way. In fact, it exists because the laws of physics are reversible. The information has to go somewhere, and if it gets too complex for you to keep track of, you call it "heat" and ignore it.
In fact, if you flip charge and parity, the laws of physics will run backwards. See https://en.wikipedia.org/wi...
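One standard way to make this precise (my formalization, not the commenter's): the von Neumann entropy of the full state is conserved under reversible evolution,

$$S(\rho) = -\operatorname{Tr}(\rho \ln \rho), \qquad S\!\left(U \rho\, U^{\dagger}\right) = S(\rho) \quad \text{for unitary } U,$$

and entropy only appears to grow when you trace out the degrees of freedom you can no longer track: the reduced state $\rho_A = \operatorname{Tr}_B(\rho)$ can have $S(\rho_A) > 0$ even when the global state is pure. That discarded remainder is what gets booked as "heat".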