
Ideal rational agents with common priors should never have common knowledge of disagreement.

As I pointed out further up the page, such agents must also have truth-seeking as their top priority for this to hold. If they have other goals, they can easily find themselves with irreconcilable differences.

You can surely be rational without having truth-seeking as your primary goal. Rationality and goals are totally orthogonal things - at least in my book. Does the repeated occurrence of this curious idea mean that people are conflating the two concepts?

The fact that humans persistently have common knowledge of disagreements indicates that something is very wrong.

It indicates that humans do not have truth-seeking as their primary goal. Of course, evolutionary theory suggests that agents with truth-seeking as their primary goal can be expected to be rare - so this hardly seems like news to me.


I've written a number of posts on disagreement myself, but I think that most reasonable parties who've been keeping track of the debate, at this point, should confess the following:

1) Ideal rational agents with common priors should never have common knowledge of disagreement.

2) In the real world, two sane rationalists with common knowledge of each other's sanity should not have common knowledge of disagreement. ("Sane" here is a variable that ranges over different definitions of sanity, but it excludes e.g. priors too crazy to reflect on their own causal origins.)

3) The fact that humans persistently have common knowledge of disagreements indicates that something is very wrong.

4) (3) shows that humans systematically overestimate their own meta-rationality, that is, ability to judge whether others are more or less rational than themselves.

5) ...and that, in a lot of cases, Disagreements Aren't About Belief.

It's where we start talking about practical remedies for this dreadful, dreadful situation, that I think we begin entering into the area of - ahem - reasonable disagreement. I don't think the debate has settled the question of what to do when you find yourself disagreeing.
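For anyone who wants to see (1) run mechanically, here is a minimal Python sketch of the Geanakoplos-Polemarchakis dialogue that drives Aumann's result: two agents share a uniform prior over a finite state space, observe different partitions, and take turns announcing posteriors, with each announcement becoming public information. The state space, partitions, and event below are invented purely for illustration.

```python
from fractions import Fraction

STATES = range(9)                             # common prior: uniform over 9 states
EVENT = {0, 2, 4, 6, 8}                       # the proposition being argued about
CELLS = {                                     # each agent's private partition
    1: [{0, 1, 2}, {3, 4, 5}, {6, 7, 8}],
    2: [{0, 3, 6}, {1, 4, 7}, {2, 5, 8}],
}

def cell_of(agent, state):
    return next(c for c in CELLS[agent] if state in c)

def posterior(agent, state, public):
    """P(EVENT | agent's own cell at `state`, intersected with public info)."""
    info = cell_of(agent, state) & public
    return Fraction(len(info & EVENT), len(info))

def dialogue(true_state, max_rounds=10):
    public = set(STATES)                      # what is commonly known so far
    last = None
    for r in range(max_rounds):
        speaker = 1 + r % 2
        q = posterior(speaker, true_state, public)
        print(f"agent {speaker} announces P(EVENT) = {q}")
        if q == last:                         # posteriors agree: stop
            break
        last = q
        # The announcement reveals exactly the set of states at which
        # the speaker would have said q; that set becomes public.
        public &= {s for s in public
                   if posterior(speaker, s, public) == q}

dialogue(true_state=4)
```

Run it and the announcements converge (here: 1/3, then 1, then 1) - the agents cannot sustain common knowledge of different posteriors.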


A few quotes from Darwin's Cathedral - by David Sloan Wilson:

Rationality is not the gold standard against which all other forms of thought are to be judged. Adaptation is the gold standard against which rationality must be judged, along with all other forms of thought. [...] If there is a trade-off between the two forms of realism, such that our beliefs can become more adaptive only by becoming factually less true, then factual realism will be the loser every time. [...] Factual realists detached from practical reality were not among our ancestors. It is the person who elevates factual truth above practical truth who must be accused of mental weakness from an evolutionary perspective. [...]

It is only when a pair of factual truth-seekers meet that they cannot disagree for long - and we can't expect to find many such seekers left in this modern era.


Disagreement Debate Status?

Robin Hanson asks about the very prospect of rational disagreement, a topic of much practical interest to mediators and negotiators....


Robin, what do you do when confronted with people and arguments you disagree with?

Trying to "never agree to disagree" is pointless in a world where disagreements are so entrenched; the correct question is how do we behave in practice, while knowing that in theory we should "never agree to disagree".

I've boiled down most of these posts on disagreement to a greater humility towards my certainties, and a few rules of thumb to keep in mind during arguments. Should I expect to get more out of them, in practice?


Matt: Can you think of any cases where NOT being a truth-seeker is preferred (would better get us what we want)? If you can, how truth-seeking should we be?

What if seeking the truth (or resolving a disagreement) is more costly than living with the disagreement/error? In that case we should focus our truth-seeking on cases where:

1) An error would be costly and/or truth-seeking is cheap, and
2) We can significantly affect the outcome.
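To make the triage concrete, here is a toy expected-value check in Python - every number in it is invented for illustration, and the linear "influence" discount is just one simple way to model point (2):

```python
# Investigate only when the expected cost of an error, weighted by how
# much we can actually affect the outcome, exceeds the cost of inquiry.
def worth_investigating(p_wrong, cost_of_error, influence, cost_of_inquiry):
    return p_wrong * cost_of_error * influence - cost_of_inquiry

# Costly error, cheap inquiry, real influence over the outcome: dig in.
print(worth_investigating(0.3, 10_000, 0.5, 100))   # 1400.0 -> investigate
# Same stakes, but no influence over the outcome: live with the error.
print(worth_investigating(0.3, 10_000, 0.0, 100))   # -100.0 -> let it go
```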


It is my understanding from following some of the threads that the formal model which gives currency to Robin's views is Aumann's formal theory of common knowledge, which I understand Robin has expanded on.

Before entering this debate, I should like to know the ground rules about formal models accepted by this community.

Is there a general acceptance that a formal model of reasoning is:

a) truth-functional;
b) a translation of the natural language into a formal language;
c) one which preserves a set of inferences, or an inference, of the natural language?


I give courtship as an example of when it can pay to believe untruths here:

It is generally in a man's genetic interests to maximise his number of descendants by maximising the number of his immediate offspring - by techniques such as impregnating as many females as possible, and skimping on parental care of offspring.

However, this is not something prospective mates are particularly keen to hear from males. Instead, females prize traits such as fidelity. They generally prefer monogamous relationships, which allow the most scope for males offering parental care.

Consequently, males interested in pursuing this sort of strategy (which evolutionary theory suggests are most males) are put into a position where they have to deceive their prospective mates about their intentions.

Hamilton suggests that they may do this by employing double-think - actually believing themselves to be whatever the females desire them to be - while not necessarily acting according to those beliefs.

It often pays to believe untruths when the rest of society acts favourably towards believers - because lying convincingly is difficult for humans.


Forgot to say: sorry for the long post.


What I'm going to say is perhaps a little off-topic, since I'm not going to address the whole issue of disagreement but rather the issue of majoritarianism. Perhaps I have misunderstood the idea, but in any case I have many doubts about its usefulness. "Agree with the majority unless you have a powerful reason to disagree" makes sense, but we won't have many chances to put it into practice, since the exception - having a powerful reason to disagree - will arise almost always on the most important topics. I have read several posts on this blog that have convinced me that the majority is biased (on economics, policy...), and that most people actually share the same biases, so the biases don't average out but rather add up.
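A quick simulation makes the point about correlated biases vivid - the true value, the shared bias, and the noise scale below are all arbitrary:

```python
# Independent errors shrink as you average more opinions; a shared bias
# does not. The crowd average converges to truth + bias, not to truth.
import numpy as np

rng = np.random.default_rng(0)
truth, shared_bias, noise = 10.0, 2.0, 5.0

for n in (10, 1_000, 100_000):
    opinions = truth + shared_bias + rng.normal(0.0, noise, size=n)
    print(f"n={n:>6}: crowd average = {opinions.mean():.2f} (truth = {truth})")
# Averaging kills the noise but not the shared bias: the estimates
# settle near 12.0, however large the crowd gets.
```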

I don't see how meta-majoritarianism would fix this; it just seems to me that someone (I don't know who came up with the idea) said, "oh well, majoritarianism won't work, so let's try to fix it by making it 'meta'". But we are as biased when we judge other people's ability to judge issues as we are when we judge the issues directly.

The thing is, a perfectly unbiased rationalist doesn't need majoritarianism, but I haven't found any, so the idea might still be useful if someone figures out how to fix it. I have an idea, which might not be a good one, but what the heck, I'll just tell you about it. I'd call it majoritarianism mixed with "overcoming bias for the people". That is, if more people made the effort some of us make to overcome bias, even to a lesser extent, the opinion of the majority would be worth considering. It would require rationalist education, or rationalist proselytism (whatever you wish to call it), instead of what seems to me an elite of rationalist truth-seekers. I'm not saying everyone has to be an earnest truth-seeker, just that if people were aware of the more common biases, other biases might well cancel out instead of adding up.

Of course I can think of some risks. The main one is something Eliezer pointed out a while ago: that some rationalism might in fact be worse than no rationalism at all. I can imagine what a population of "clever arguers" would look like. In the best case it would be irritating; in the worst, a disaster. Also, I recall Eliezer discussing whether bias can make people happy, and if I recall correctly his main conclusion was: oh well, you really have no choice once you have started seeking truth, since you can't be selectively rational. Well, if my idea of rationalist proselytism were to be seriously considered, the issue would need a second thought. Maybe we don't have the option of being selectively rational, but we can let other people be as irrational as they like, if that makes them happy.


It's clear that rational agents with common priors (or in many cases with similar reasonable priors) will not disagree.

No, it isn't. The agents must also have truth-seeking as their top priority. If they have other goals, they can easily find themselves with irreconcilable differences.


Caledonian,

Can you think of any cases where NOT being a truth-seeker is preferred (would better get us what we want)? If you can, how truth-seeking should we be?


Robin,

I would like to see some bloggingheads.tv debates on disagreement between you and someone you disagree with very much.


"So what do folks think is the status of the debate on the rationality of disagreement?"

What debate? It's clear that rational agents with common priors (or in many cases with similar reasonable priors) will not disagree. Therefore, any disagreements are due to non-rational agents or unreasonable priors.

Which describes humans pretty well.


Tim, yes of course, but the question is when you can have reasons to think you care about truth more than others, relative to your interests.

There's part of the problem, right there - caring about the truth is an absolute, not a relative.

Very few people value possessing and stating the truth more than maintaining political power or being able to think well of themselves. The question we must ask is not "whether we care about truth more than the next guy", but whether we care about truth as anything more than an instrument - a way of getting the things we really care about.


In biology, organisms may be expected to be interested in making copies of their genes. Seeking the truth might happen as a by-product of that - but we shouldn't expect to find many organisms prioritising it that highly.
