The honesty of Hal's scenario depends on whether "privately believing" means that you haven't also publicly indicated that belief.

Hal Finney: You'd be pushing a strong position while privately believing that the other side may well be right. As long as you're honest about it, that seems fine.

Honest as the term is used colloquially, but not as required for Aumann's theorem. In general, people don't express their beliefs as honestly assessed probabilities; the natural human tendency is to argue for the position you want the other person to believe more strongly, while hiding your doubts. Note that it is rational to discount the expressed opinions of someone who may be "playing devil's advocate". I suspect that in practice this may be as much of a barrier to the real-world applicability of Aumann's theorem as human irrationality is.

Of course, I myself do play devil's advocate sometimes, and you should be less convinced by my opinions because of this.

I've been out of touch and I'm sorry to come late to this discussion. I proposed a similar thought experiment a couple of years ago, and here is a version of what I said then:

Here's a thought experiment. Suppose you were duplicated, and you and your copy went out and independently worked and did research on X. You come back together after a year, and you simultaneously report your estimates on whether a given X-related project will succeed. One of you says the odds are 1/10, and the other says the odds are 9/10.

I would think, in this situation, that at least one of you would be extremely surprised and shocked. You would be forced to accept the fact that your copy had accumulated evidence during that year which led him to a very different estimate than your own. And the mere knowledge of that difference, even before you begin talking about the details of what you both learned, would be enough to sharply change your internal estimate of the probability.

You might even be struck sufficiently by the symmetry of the situation to get past the automatic assumption that you are more likely to be right than your copy. You would understand that it was really only an accident of chance which copy your consciousness was in, that you could just as easily have been the other one, in which case you would hold the opposite view about probability.
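
To make the symmetry concrete, here is a minimal sketch (my own toy setup, not anything from the scenario itself): assume both copies started from a shared 50/50 prior and gathered independent evidence, so their two reports can be pooled by adding log-odds and subtracting the shared prior's log-odds once.

```python
import math

def logit(p):
    """Log-odds of a probability."""
    return math.log(p / (1 - p))

def pool(p_a, p_b, prior=0.5):
    # Combine two posteriors that each updated the same prior on
    # independent evidence: add their log-odds, subtract the prior's once.
    z = logit(p_a) + logit(p_b) - logit(prior)
    return 1 / (1 + math.exp(-z))

print(pool(0.1, 0.9))  # ≈ 0.5: the two bodies of evidence cancel exactly
```

Under these (admittedly idealized) assumptions, reports of 1/10 and 9/10 carry equally strong evidence in opposite directions, and the pooled estimate falls right back to the prior.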

Suppose the evidence you accumulated during the year had not been as strong as you hoped and expected, making it likely that your copy had better luck than you did in the strength of his evidence. Given the symmetry which otherwise exists, you would then prefer to adopt his position as your new estimate. OTOH, if the year had been unusually productive, giving you unusually strong evidence for your beliefs, you would choose not to switch. Since you know nothing about your copy's experiences along these lines, you can't predict whether he will switch or not; it will depend solely on whether the strength and quality of the evidence he accumulated during the year was above or below average, which is not known to you.

Even if you do follow these predicted behaviors, you would still be right to argue about (or at least vigorously discuss) the reasons why your copy came to such a dramatically different result than you did. You want to pool your information and come up with the best-quality estimate based on everything you two learned during the year. And it may well be that the best way to do that is for one of you to defend the 0.1 estimate while the other defends 0.9, each based on what he learned.

While doing this, though, I think you would each mentally be somewhat in the frame of mind of playing devil's advocate. You'd be pushing a strong position while privately believing that the other side may well be right. As long as you're honest about it, that seems fine. I think this is the mental stance that Bayesians would have to hold while arguing and disagreeing with each other, and such a practice may plausibly be the optimal method for consolidating information.

"Everyone, it is too easy to say you would agree after you considered all possible relevant material, or after you had made sure both of you had seen exactly the same material. That avoids the harder and more interesting question, since in practice we rarely have time to look at all relevant or exactly the same material as those we disagree with."

If all I have to go on is the claim, I have to reconsider my estimates of my conclusion. I don't have good reasons to believe that my alter-ego has *better* evidence than I do, so my new estimate goes back to no opinion with regard to the question, and presumably my alter-ego will make the same decision. No matter how strong the evidence that I have seen, it is possible for it to be filtered and for him to have an equivalent body of countervailing evidence. I suppose if my evidence had been relatively weak, then I could be convinced by my alter-ego simply because of his conviction, assuming I could be reasonably assured that he hadn't been brainwashed or altered in some way to make him more susceptible to poor argument.

If I am strongly convinced by the evidence I've seen, then, given that I should already (as Douglas K and Carl S. point out) have discounted that evidence somewhat for its one-sided purview and the lack of counterevidence, the farthest I can probably get is back to tabula rasa. So we'll probably end up there, unless one of us feels his set of evidence is significantly weaker than the other does, or one of us shows the other signs of suspect mental alteration.

Dagon, if you think there is a small chance they didn't understand you, or that you didn't understand them, that justifies only a small disagreement - the effect is linear.

It's the "clear what it meant" and "clear that they meant it" that's tripping me up. I know of very few persistent disagreements where those hold. For non-trivial claims, it's impossible to be clear exactly what someone means, and even more impossible to be clear that they mean it.

Something as simple as "I think it will rain tomorrow" is questionable. Even if you try to be insanely explicit: "I think there's more than a 42% chance that there will be measurable precipitation somewhere in the area", you STILL don't know if that's what the speaker really means. What he really means may be "I want credit for warning you about whatever weather occurs".

For more complicated claims, like which side was responsible for a war, it's completely impossible to know what anyone means with enough exactness and trust to give their opinion as much weight as your own. Both sides are likely biased, but I have more data about the extent and honesty of my own claims.

This is not to say that contradiction isn't strong evidence that I should examine my position and in many cases change my convictions. All I'm saying is there's an upper bound to the information quality coming from other people, and this leaves us open to rational disagreement.

Dagon, all we need is that it be clear that someone said something, clear what it meant, and clear that they meant it.

I think the term "fact" is confusing when discussing this topic. Almost none of the disagreements I have with respected peers, which my double would be, are over verifiable, simple facts in the vernacular sense of the word.

I get the impression that the technical use of the word "fact" is much broader, and includes probability estimates of future events, recommended strategies, etc. I _DO_ disagree with peers over some of these, and I expect I'd disagree with my double. I further expect that the reasons for disagreement would be the same: I cannot trust the communication of beliefs. There are belief biases, which can be taken to be the same in me and my double, but there are also signalling and communication biases, which affect claims made by non-me differently than those made by me.

I don't understand how trust is not required. All the models I've seen use phrases like "reliably informed" or otherwise imply that knowledge about someone's contradictory belief is actual knowledge about the contradiction, rather than suspect knowledge inferred from misleading signals.

Well, (ideally, as I said,) the neutrality would be there regardless of our knowledge of either the duplication itself or the future meeting. The neutrality is something we should practice all the time.

Pdf, I never said you and he were told that you would meet in a day to compare notes; that could have been a surprise.

To expand on Douglas's comment: to expect that one will be convinced by a certain argument more than the argument actually warrants (for example, to expect that after hearing the opposite side's argument one will be less certain about one's position) is the same as saying that one is not discounting the evidence of the argument enough to compensate for its polemical nature. If one can predict now that one's opinion will change in a certain direction in the future, one should change it in that direction right now. If I have read one side's argument and estimate that my opinion will shift by X after reading the other side's, I should go ahead and shift it by X now. I can still expect that my opinion will change after reading the other side's argument, but I cannot predict in which direction it will shift. If I understand it correctly, this follows directly from Aumann's theorem.

(I actually made this same mistake on an earlier thread.)
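
The "no predictable direction" point can be checked numerically. Here is a minimal sketch with a toy setup of my own invention (a coin that is either biased toward heads or fair, with a 0.3 prior on "biased"): whichever way a flip comes out, Bayes' rule moves the estimate, but averaged over what you expect to see, the posterior equals the prior.

```python
import random

def posterior(prior, heads, p_biased=0.8, p_fair=0.5):
    """P(coin is biased | one observed flip), by Bayes' rule."""
    like_b = p_biased if heads else 1 - p_biased
    like_f = p_fair if heads else 1 - p_fair
    return prior * like_b / (prior * like_b + (1 - prior) * like_f)

random.seed(0)
prior = 0.3            # prior probability the coin is biased toward heads
trials = 200_000
total = 0.0
for _ in range(trials):
    biased = random.random() < prior        # draw the truth from the prior
    p_heads = 0.8 if biased else 0.5
    heads = random.random() < p_heads       # observe one flip
    total += posterior(prior, heads)        # update on the evidence

avg_posterior = total / trials
print(round(avg_posterior, 2))  # ≈ 0.3: on average, updating leaves the prior unchanged
```

Seeing heads pushes the estimate up to about 0.41 and seeing tails pushes it down to about 0.15, yet the expected posterior, taken before the flip, is exactly the prior; any shift you could predict in advance should already have been made.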

Ideally, my duplicate and I would both be mostly neutral about the dispute, since we recognized the one-sided nature of the information we learned during the day and withheld judgment for lack of information from the other point of view. Then, upon meeting, we'd both defend what we saw the previous day, but only as an efficient way of exchanging information about arguments, not because we actually held the respective positions.

I think that some (but not much) real life discourse is actually like this. People will defend a position directly that they don't actually espouse in order to get the most information quickly out of someone with an opposing position, without actually believing most of what they're defending. They might even talk about it as if it were their actual beliefs. How common is this? Probably not very.

Michael Sullivan: it's reasonable to think that we will tend to believe the side whose arguments we read

It may be reasonable self-knowledge, but it is not knowledge of a reasonable self. Acting as you predict is an error and we should try to avoid it. Looking for "obvious bias or hackery" is not enough.

If by "reasonable self-knowledge" you mean that it's difficult to evaluate polemics, then I agree with you, but it sounds like you're giving up too soon.

Dagon, follow the link "we can't foresee to disagree"; perfect communication and trust are not required. And since you are estimating the same matter of fact, I don't see how a bias could be good for him and bad for you.

Why would you expect to have perfect communication and trust with your double? I wouldn't expect to agree on all topics after some amount of divergent inputs any more than I would with any other human.

He wants something from me, and I want something from him (at the very least, we want not to be attacked, and probably want cooperation on other fronts). This puts an upper bound on how much trust we can each give the others' claims.

For a recent double, it's likely that I have more insight into his motivations and am somewhat better at determining his true beliefs than I am a random person, but it's still not perfect, or even all that good.

It seems perfectly reasonable to believe that my biases are just as likely to be truthful as his, and that they benefit me more than his do, so I should weigh my own claims more heavily (after doing what I can to identify and discard biases on both sides, and to share evidence in both directions).

I agree, you should come to agreement with your duplicate even when materials and reasons cannot be exchanged. However, if you are repeatedly exchanging conclusions, then their path should be a random walk until agreement is reached, while a powerful new bias is disproportionately likely to lead to no movement. If your duplicate's opinion does not change at all over several exchanges of opinion in which you shift your estimates closer to his or hers, the probability of a mental problem becomes more worrisome. The evidentiary impact of this depends on the ratio between your prior probabilities of mental defect in your duplicate and of the other side possessing extraordinarily powerful evidence that would justify stability of opinion over a long exchange.

Everyone, it is too easy to say you would agree after you considered all possible relevant material, or after you had made sure both of you had seen exactly the same material. That avoids the harder and more interesting question, since in practice we rarely have time to look at all relevant or exactly the same material as those we disagree with.
