The usual party chat rule says not to spend too long on any one topic, but instead to flit among topics unpredictably. Many thinkers also seem to follow a rule where if they think about a topic and then write up an opinion, they are done and don't need to ever revisit the topic again. In contrast, I have great patience for returning again and again to the most important topics, even if they seem crazy hard. And for spending a lot of time on each topic, even if I'm at a party.
Right - except that the differing goals could be genetic and need not have anything to do with social behavior. The disagreement theory Robin refers to rests on the assumption of shared goals. However, people don't necessarily have shared goals; in practice, their goals conflict. Alice might favor donating funds to her daughter, Ann, while Betty might favor giving the same funds to her son, Bob. Even two "truth seekers" can have conflicting goals: one truth seeker might value the truth about physics, while another values the truth about chemistry more highly. Goal differences of any kind lead down the same path toward disagreement.
"because their social world rewards them" above would seem to fit what you're talking about.
All this only makes sense if you actually mean "rational, truth-seeking agents". However, you don't say that here - you just say "rational". "Rational" also commonly refers to instrumental rationality, which is, I think, the more common usage of the term.
Instrumentally rational agents can disagree about all kinds of things. They don't necessarily express their beliefs truthfully, since they can often gain more by using their beliefs for signalling purposes - in order to manipulate others. Take the beliefs signalled by tobacco company executives about the harmfulness of cigarettes, for instance. That makes no sense under the hypothesis that they are rational, truth-seeking agents - but instrumental rationality explains it easily.
You characterize deviations from truth-seeking - such as instrumental rationality - as a type of "broken mind". However, standard biological theory predicts that most evolved agents will behave like this. Rational, truth-seeking agents are surely the exception. Few evolved agents will overcome billions of years of evolution rewarding their ancestors for reproducing and replace that primary goal with truth-seeking. As an unstated premise, that is a pretty wild assumption.
I was not trying to prove God with that point, I was describing how a "God conclusion" is going to be a common outcome of thinking on the topic long enough.
With regard to "believing in the greatest", did you read the next paragraph?
Anyway, to further explain, the argument is that to be the greatest it must be believed in, otherwise it's not the greatest. Imagine a god that you don't believe in, and then imagine the same god but changed enough for you to believe in it - that second god is greater than the first.
The point I made there is that that type of thinking could be viewed as a logical trap of a sort - people will keep thinking until they come up with something they can believe in, and that thing is likely to be "the greatest" thing they can imagine - a god-like thing, if not the traditional God. This line of thought has always existed and is probably a more likely reason for a wrong belief in God to persist over thousands of years than the social desirability that Robin posited.
But you can also take the argument in other ways - why does thought have a tendency to reach outcomes like God but without being able to prove it?
With regard to the "bald assumptions", it's a big topic. I briefly looked for a summary link for you, but I didn't find anything that neatly sums it up. You can try the following:
Basics about science: https://undsci.berkeley.edu...
Limits of logic: https://en.wikipedia.org/wi...
There are a lot of books and philosophical articles written about the limits of science and logic. I haven't made up these "bald assumptions".
There are plenty of things that aren't known yet but will be found out; not everything falls into that category, though. There are also things that will never be known, including true things that can't be proven true and false things that can't be proven false. These tend, for some reason, to include existential/ultimate things, but not always. See the following list for some examples: https://en.wikipedia.org/wi...
You could have cited Anselm (1033-1109). His "ontological proof" is nearly 1000 years old (but only proving, in my humble opinion, that old ideas are not necessarily good ones). See, among others, https://iep.utm.edu/ont-arg/
if you don't believe, you haven't actually imagined the greatest, because you would believe in something that was the greatest if you are reasonably smart.
I don't understand. Why would I necessarily believe in "something that was the greatest"?
There can be no scientific or logical understanding within the universe of why the universe exists, and of many other similar things, such as what happens when you die. There are true things beyond logical proof, and things beyond science.
Those seem to be a set of bald assertions without evidence or argument behind them. How do we know that we can't know these things? We know lots of things today that smart people didn't know 200 years ago.
Is there some reason we can't say "these are things we don't know - now" but leave open the possibility that we may learn these things in the future?
Let me add that, while I agree (with the qualifications below) with the general thrust of your point, I think those who end up overly contrarian are applying the exact same kind of heuristic. It's just that the fact that they hold contrarian beliefs means that, to them, it looks like other people's beliefs end up twisted on those kinds of issues.
After all, how does one gather evidence that a particular topic is the sort of thing that is very good at twisting other minds? Well, you evaluate those kinds of beliefs and see if they are often false. And yes, you can avoid direct circularity by looking at other similar beliefs (e.g. look at all religious beliefs not just this particular one) but your judgement of what's a similar belief is very tied up with your judgement of how that belief relates to the truth.
I mean, even conspiracy theorists can give you a theory about why people's views are twisted on the issues they are conspiracy theorists about, so I'm not sure this is going to be able to improve many people's beliefs in practice.
I don't think that answer is completely right. Surely you need to consider both the strength of your argument for the proposition in question as well as the evidence you have that other people's minds get broken. If you only have very weak evidence for your contrarian object level claim then you probably ought to avoid accepting it unless you have really strong evidence that the topic twists other people's minds.
OTOH, if you find everyone denying that A and A -> B entails B then even without a strong argument that the topic warps the minds of others you may still want to accept it (of course, in the real world, there is always the possibility that there has been a miscommunication as well).
Also, surely it matters whether you are disagreeing over the analysis that others offer or disagreeing about what priors to assign. Yes, in many real world cases those can be hard to disentangle but in those cases where it's clearly a difference in priors (not a disagreement about evidence/experience/analysis) surely you need less evidence that the subject twists other minds.
Plenty of the smartest people have believed in God - Newton, Leibniz, Einstein, Goedel for a few examples. And these people thought long and hard about it.
If you don't believe in God, it's because your conception of God is not satisfying to you. God is by definition "the greatest", and if you don't believe, you haven't actually imagined the greatest, because you would believe in something that was the greatest if you are reasonably smart. If you keep thinking, you could come up with an understanding of God that is satisfying to you.
That previous paragraph can be seen as a way that large numbers of people might come to believe in God wrongly, and so a reason to doubt God by your "new take". It can also be taken at face value as a reason to keep thinking about God until you believe.
Many (most?) religious people acknowledge that they have a "faith" because they understand that there can be no scientific or logical understanding within the universe of why the universe exists, and of many other similar things, such as what happens when you die. There are true things beyond logical proof, and things beyond science.
People of faith believe things they think work in their life and in society. Everyone does in the end; it's just that most people of faith know that what they believe is faith, and know that it has a long history of working reasonably well for many people over a long time. Others, meanwhile, think that someone somewhere has proven, or likely will prove, what they believe, despite there being little or no history of it working over any length of time.
You say social desirability is a point against an idea because it's a reason many people might be wrong or accept something without thinking too hard about it. I don't think God has been the beneficiary of social desirability for decades - not in most of the western world anyway, nor probably in the East either. If the social desirability is gone or changes, is the minority that was previously the majority now more likely to be correct?