We seem comfortable celebrating those who practice complex statistical analysis, even if only one in a thousand can do so. And the one-in-a-billion genius celebrated for greatly improving our understanding or practice of statistics, or of rationality more generally, seems to us an epitome of the best in humanity. Who would call such exemplars "inhuman"?
"I didn't say you had no information about the quality of your own judgment."
The crucial point is whether you have information about the relative quality of the judgments. If you have information about the absolute quality of your own judgment, that will imply information about the relative quality of the other person's judgment, given your priors. In theory you can then weight things accordingly.
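To make that concrete, here is a minimal Python sketch of the weighting this implies; every number in it is invented, and simple linear pooling is just one of several ways to combine judgments:

```python
# Toy linear pooling of two judgments, weighted by estimated reliability.
# All numbers here are hypothetical.
my_estimate, your_estimate = 0.9, 0.4   # each person's P(claim is true)
my_reliability = 0.8                    # my estimated chance of judging correctly
prior_reliability = 0.6                 # prior reliability of a typical observer

# Weight each judgment in proportion to its estimated reliability.
w_me = my_reliability / (my_reliability + prior_reliability)
combined = w_me * my_estimate + (1 - w_me) * your_estimate
print(round(combined, 3))  # 0.686: tilted toward, but not fully onto, my view
```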
It may be misleading to refer to what you advocate as conformism. It seems very similar to conformism on issues where most people behave as conformists. But on issues where people are irrationally polarized, your advice suggests people hold rather unusual beliefs. Someone who averages the beliefs of Christians, Muslims, Confucians, etc. will look quite different from a conformist.
I didn't say you had no information about the quality of your own judgment.
Further, the tendency of human beings to rely on the judgments of others means that they're not independent any longer. Ten people agreeing on something as a group does not have the same weight as ten people agreeing on something as individuals.
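A toy calculation shows how much the independence matters (the 70% accuracy figure is invented). Ten independent observers multiply their likelihood ratios; ten people echoing one observer contribute only a single ratio:

```python
# Each observer is assumed to judge correctly 70% of the time.
p = 0.7
lr_single = p / (1 - p)           # likelihood ratio from one observer, ~2.33

lr_independent = lr_single ** 10  # ten genuinely independent agreements, ~4800
lr_echoed = lr_single             # ten people repeating one person's judgment

print(lr_independent, lr_echoed)
```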
"So if you have no information at all about the quality of the observers' judgment, how should you respond? By disregarding your own judgment and accepting theirs, or retaining yours and ignoring theirs?"
Neither: treat each judgment as equally likely to be right.
"Caledonian - no-one's suggesting that you believe the word of others in defiance of your own experiential evidence, as well you know."
Actually, I am suggesting this, in some cases. If Caledonian sees a small purple unicorn that no one but he can perceive, I would suggest he ponder seriously whether the consensus view (that small purple unicorns do not exist) is correct, and he is hallucinating.
More generally, we should always keep in mind the large error rate of our own perceptual capacities (and, even more starkly, the systematic defects in the reliability of our memories). Read some of the literature on the reliability of eyewitness identification testimony and you will see what I mean. Unless you have reason to believe that you are a more reliable witness than other people, you should follow Robin's dictum even regarding experiential beliefs.
"Assuming that the differences in humans' beliefs are objectively trivial compared to the similarities, the remaining differences might be a diminishing reflection of rationality."
Except I'm still not convinced the assumption is coherent, let alone true. Maybe I need a rocking chair? I know I'm the one who first used the phrase here (to paraphrase your position, in the antecedent of a conditional), but I would contend that "objectively trivial" is a contradiction in terms.
"Maybe there are some biases we shouldn't try to overcome for the sake of rationality."
Probably there are benefits to believing false things, for those to whom it really doesn't matter whether their beliefs are true or not, but I'm not in a position to care.
Davis, thank you for responding to my posts. I addressed you as "Mr." only because of the demographics of this community. I used it out of respect, and I apologize if you were offended. I shouldn't have made any assumptions. Please don't tell me if you're male or female, I'd rather not know.
I don't think my above posts are irrelevant to Robin's original post. Assuming that the differences in humans' beliefs are objectively trivial compared to the similarities, the remaining differences might be a diminishing reflection of rationality. Having one eye is unusual because most people have two, but it certainly isn't irrational to have only one. However, voluntarily popping an eye out is extremely irrational. People don't do this, we all agree that not popping our eyes out is a good idea, and this is a good reflection of our rationality.
The ways that people most commonly separate themselves are arguably not a matter of rationality in most circumstances. Let's say it is irrational to believe in God. Lots of people do it, and it forces them to exist in a world where basic scientific concepts are denied while other, more complex scientific concepts are accepted and made use of every day. To deny evolution but accept cancer treatment... obviously irrational.
But maybe not. What if believing in God allows you to be part of a community that provides you with something that makes your life more comfortable? If you need social support or monetary support, religious affiliation can provide it. Is it irrational to sacrifice a rational belief to obtain something beneficial? If an immigrant who just moved to this country were faced with a choice between believing in evolution and sacrificing that belief to get some help learning English, setting up a small business, and meeting new friends in the community, I would argue that it is irrational to maintain the rational belief in evolution.
I can't think of any examples where popping out your eyeball is a good idea.
It's not obvious to me either that there is a well-defined, objective way to compare similarities and differences amongst people without reference to anything else. I simply propose sitting in a comfortable chair, preferably one that rocks, and thinking about it. I would argue that if the differences in humans' beliefs are objectively trivial compared to the similarities, then the usefulness of exploring these differences to gain insight into human rationality is trivial as well. I would also argue that rationality is not absolute and should be considered relative to the living situations of the people making decisions. If I believe that 1+2=4, and that belief clearly benefits me, then maybe it's not so irrational to believe it. Maybe there are some biases we shouldn't try to overcome for the sake of rationality.
So if you have no information at all about the quality of the observers' judgment, how should you respond? By disregarding your own judgment and accepting theirs, or retaining yours and ignoring theirs?
Michael, it would be fun to do a blog solely devoted to being critical of overcomingbias, with you and a few other consistently critical posters. It could be called "Overcoming Overcoming Bias" or something like that. :P As a plus, I think it would be more likely to draw the responses you (and I) would both like to see from Robin.
"If you have no reason to believe that they have evidence at least as good or better than yours, you have no reason to accept their conclusions above your own, all else being equal."
The symmetric corollary of this is:
"If you have no reason to believe that they have evidence that is worse than yours, you have no reason to accept your own conclusions above theirs, all else being equal."
Which is precisely the point everyone else is making, and which you seem to insist on misinterpreting.
"Caledonian - no-one's suggesting that you believe the word of others in defiance of your own experiential evidence, as well you know."
On the contrary, that is PRECISELY what the position expressed by some here requires. Which is why it's a ridiculous position.
"Robin's suggestion is that when our data is incomplete, the apparent beliefs of others should have (at least some) weight in informing our decisions. Tell me exactly what your problem with this stance is."
If you have no reason to believe that they have evidence at least as good or better than yours, you have no reason to accept their conclusions above your own, all else being equal. If you have no reason to believe that they are more likely to reach a correct conclusion than you, you have no reason to accept their conclusions above your own, all else being equal.
If you're trying to evaluate your own conclusion-drawing ability, and you are using other people's abilities as a standard, you have reason to concern yourself with their evaluations. Otherwise, you may introduce bias.
When there are multiple variables involved, in order to evaluate one, we must hold all others constant.
Thanks Hal. I have proposed something very like that in the past, as pointing very close to the direction in which I am trying to point when I say, for instance, that some song or painting is objectively better than another, and I genuinely do try to use this principle when talking about art and the like, e.g. saying that I don't like Monty Python, but based on the people who do, they are very good. (Hmm, looks like I said that in the comments of the linked post too.) It seemed to me that "one is only entitled to use weighing functions which are attractors, e.g. which endorse themselves" was an expression of this principle, though apparently not an articulate one, and that the remainder of the paragraph was an exploration of potential problems with such an approach.
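One hedged way to cash out "weighing functions which are attractors" is a DeGroot-style fixed point: let each person's weight be the weight-averaged esteem the others assign them, and iterate until the weights reproduce themselves. The trust matrix below is purely illustrative:

```python
# Sketch of a self-endorsing weighing function (DeGroot-style pooling).
# trust[i][j] = weight person i gives to person j's judgment; rows sum to 1.
trust = [
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.3, 0.3, 0.4],
]
weights = [1 / 3] * 3   # start from uniform weights
for _ in range(100):    # iterate until the weighting endorses itself
    weights = [sum(weights[i] * trust[i][j] for i in range(3))
               for j in range(3)]
print(weights)          # the fixed point: a weighting that reproduces itself
```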
Robin: Maybe this should be a "disagreement case study". If you don't want to spend the time for that, what about just examining how we are better off, and how worse off, after transferring uncertainty from propositions to the weighing of other people's beliefs?
Caledonian - no-one's suggesting that you believe the word of others in defiance of your own experiential evidence, as well you know.
There are two extremes - flatly ignoring the views of others in all cases; and ignoring your own experience to believe everything you're told. No sane person would advocate either. Robin's suggestion is that when our data is incomplete, the apparent beliefs of others should have (at least some) weight in informing our decisions. Tell me exactly what your problem with this stance is.
Caledonian: They have huge incentives to make you think your spouse didn't cheat (P(they claim not to cheat | spouse is cheating) is high). Even if they didn't, what you see and hear is very strong evidence (P(spouse is cheating | what you saw) is high).
Now, if your spouse has incentives to make you think (s)he cheated, and you only saw weak evidence (say, vaguely flirtatious behavior), would you still doubt their claims? I don't think so.
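Putting rough numbers on it (every probability below is invented): the denial barely moves the posterior, because a cheating spouse would deny almost as reliably as an innocent one:

```python
# Hedged worked example of the cheating-spouse case; all numbers invented.
prior = 0.05                 # P(cheating) before any evidence
p_scene_cheat = 0.90         # P(walking in on that scene | cheating)
p_scene_clean = 0.001        # P(that scene | innocent explanation)
p_deny_cheat = 0.95          # a cheater almost always denies
p_deny_clean = 0.99          # so does an innocent spouse

num = prior * p_scene_cheat * p_deny_cheat
den = num + (1 - prior) * p_scene_clean * p_deny_clean
print(round(num / den, 3))   # 0.978: the denial is nearly worthless as evidence
```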
"Who are you going to believe: me, or your lying eyes?"
When we're trying to learn about a phenomenon, our observations of that phenomenon are more valuable than other people's conclusions about those observations. This is not a matter of degree but one of kind - there is a hierarchy of evidence in which things in themselves have a deeper priority than what we believe about them.
Thus, when we walk in upon our spouse cheating with a stranger, we are inclined to trust our senses and our interpretation of their data over the words of our spouse and the stranger.
What's that? We're taking our own judgement over the judgement of two people? Clearly, we must have left rationality behind... for a decidedly non-standard definition of 'rationality'.
"I didn't say you had no information about the quality of your own judgment."
The crucial point is whether you have information about the relative quality of the judgments. If you have information about the absolute quality of your own judgment, that will imply information about the relative quality of the other person's judgment, given your priors. In theory you can then weight things accordingly.
It may be misleading to refer to what you advocate as conformism.It seems very similar to conformism on issues where most people behave as conformists.But on issues where people are irrationally polarized, your advice suggests people hold rather unusual beliefs. Someone who averages the beliefs of Christians, Muslims, Confucians, etc will look quite different from a conformist.
Neither: treat each judgment as equally likely to be right.
I didn't say you had no information about the quality of your own judgment.
Further, the tendency of human beings to rely on the judgments of others means that they're not independent any longer. Ten people agreeing on something as a group does not have the same weight as ten people agreeing on something as individuals.
"So if you have no information at all about the quality of the observers' judgment, how should you respond? By disregarding your own judgment and accepting theirs, or retaining yours and ignoring theirs?"
Neither: treat each judgment as equally likely to be right.
Caledonian - no-one's suggesting that you believe the word of others in defiance of your own experiential evidence, as well you know.
Actually, I am suggesting this, in some cases. If Caledonian sees a small purple unicorn that no one but he can perceive, I would suggest he ponder seriously whether the consensus view (that small purple unicorns do not exist) is correct, and he is hallucinating.
More generally, we should always keep in mind the large error factor in our own perceptual capacities (and even more starkly, the systematic defects in the reliability of our memories). Read some of the literature on the reliability of eyewitness identification testimony, and you will see what I mean. Unless you have reason to believe that you are a more reliable witness than other people, you should follow Robin's dictum even regarding experiential beliefs.
"Assuming that the differences in humans' beliefs are objectively trivial compared to the similarities, the remaining differences might be a diminishing reflection of rationality."
Except I'm still not convinced the assumption is coherent, let alone true. Maybe I need a rocking chair? I know I'm the one who first used the phrase here (to paraphrase your position, in the antecedent of a conditional), but I would contend that "objectively trivial" is a contradiction in terms.
"Maybe there are some biases we shouldn't try to overcome for the sake of rationality."
Probably there are benefits to believing false things, for those who to whom it really doesn't matter whether their beliefs are true or not, but I'm not in a position to care.
Davis, thank you for responding to my posts. I addressed you as "Mr." only because of the demographics of this community. I used it out of respect, and I apologize if you were offended. I shouldn't have made any assumptions. Please don't tell me if you're male or female, I'd rather not know.
I don't think my above posts are irrelevant to Robin's original post. Assuming that the differences in humans' beliefs are objectively trivial compared to the similarities, the remaining differences might be a diminishing reflection of rationality. Having one eye is unusual because most people have two, but it certainly isn't irrational to have only one. However, voluntarily popping an eye out is extremely irrational. People don't do this, we all agree that not popping our eyes out is a good idea, and this is a good reflection of our rationality.
The ways that people most commonly separate themselves are arguably not a matter of rationality in most circumstances. Lets say it is irrational to believe in God. Lots of people do it, and it forces them to exist in a world where basic scientific concepts are denied while other more complex scientific concepts are accepted and made use of every day. To deny evolution but accept cancer treatment... obviously irrational.
But maybe not. What if believing in God allows you to be part of a community that provides you with something that makes your life more comfortable. If you need social support or monetary support, religious affiliation can provide it. Is it irrational to sacrifice a rational belief to obtain something beneficial? If an immigrant who just moved to this country was faced with a choice of believing in evolution or sacrificing that belief to get some help learning English, setting up a small business, and meeting new friends in the community, I would argue that it is irrational to maintain the rational belief in evolution.
I can't think of any examples where popping out your eyeball is a good idea.
It's not obvious to me either that there is a well-defined, objective way to compare similarities and differences amongst people without reference to anything else. I simply propose sitting in a comfortable chair, preferably one that rocks, and thinking about it. I would argue that if the differences in humans' beliefs are objectively trivial compared to the similarities, then the usefulness in exploring these differences to gain insight into human rationality is trivial as well. I would also argue that rationality is not absolute and should be considered relative to the living situations of the people making decisions. If I believe that 1+2=4, and that belief clearly benefits me, then maybe it's not so irrational to believe. Maybe there are some biases we shouldn't try to overcome for the sake of rationality.
"So if you have no information at all [...]"
Use an ignorance prior?
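As a minimal sketch (the estimates are invented): with no information about anyone's reliability, a uniform ignorance prior simply averages the conflicting judgments, privileging no one.

```python
# Ignorance prior over n conflicting judgments: weight each equally.
estimates = [0.9, 0.2, 0.4]   # hypothetical P(claim) from three observers
pooled = sum(estimates) / len(estimates)
print(pooled)                 # 0.5: no judgment favored over another
```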