What MV said. Caledonian, surely the truly rational don't waste time worrying about whether or not they're sane?
Bryan seems to mistakenly conclude that one is more justified in relying on one's own direct evaluation of arguments, relative to the evaluations of others, on topics where people tend to be more irrational.
But he doesn't suggest this as a piece of general advice (i.e., he would say 'I' or 'economists' rather than 'one').
If you (believe that you) have evidence that you're more rational than the average person, his is excellent advice. I do, Caplan does, everyone here does if they're honest. If you believe you're of below-average rationality, do the opposite. Sounds fine to me.
"I do think far too many irrational people have a subjective perception that they have chosen to be rational for that be more than a rather weak guide to one's own rationality."
What does that matter to a rational person? No number of suicide rocks is sufficient to make me worry that I am one. Ditto creationists or astrology believers. Insofar as I am rational, I "should" believe that I am rational, as that belief will cause me to win. If I'm not rational, I won't do what I "should" in any event, so the point is moot.
Those who fall into madness believe they're finally becoming sane. Those who fall into sanity believe it's possible that they may be sane, but fear and doubt their own sanity.
Then fearing and doubting sanity is (not entirely communicable) evidence for sanity.
Caledonian, your portrayal of insanity seems very different from Caplan's. Have you read his paper, and if so, what do you think of it?
I think people have imperfect access to their rationality and are often motivated not to pursue correct beliefs.
Eliezer, in this post I didn't address the issue of how modest to be overall regarding your vs. others' evaluations. Instead I said that confidence should be lower, but modesty should not be, on low-material-cost topics.
I find myself leaning toward Caplan. I too am suspicious of modesty on multiple grounds: it's easy to endorse modesty and then not change any of your opinions, it's a general-purpose counterargument, etc.
If Caplan has identified and compensated for all the specific biases he knows of, then he is justified in believing his beliefs to be above average. As for the possibility of remaining biases, he should adjust his probabilities so that he expects the discovery of remaining biases, like the discovery of remaining unknown unknowns, to shift his beliefs the same amount in either direction on average.
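(A formal gloss on that last condition, not in the original comment but a standard Bayesian identity, conservation of expected evidence. For a belief $H$ and any partition of possible discoveries $E_1, \dots, E_n$,
$$\sum_i P(E_i)\,P(H \mid E_i) = P(H),$$
so the probability-weighted upward and downward revisions must cancel exactly; anyone who can predict the direction of a future shift should make that shift now.)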
I'm also sympathetic to pleas not to lose sight of first-order, object-level arguments while trying to adjust for just the right amount of modesty with respect to a crowd that one has just demonstrated to be individually and collectively insane in various specific ways. All judgments begin at the object level, and its straight-out first-order verdict should never be neglected.
"Remember that limited expertise is better than none at all."
For some reason, I feel compelled to point out counterexamples; consider the history of medicine in Europe before, say, 1800. Doctors were more likely to hurt you than help you.
People have certain privileged knowledge about their own rationality that they don't have about that of other people or of suicide rock.
No, we don't. One of the first things rational people realize is that they cannot demonstrate their own rationality to themselves.
Those who fall into madness believe they're finally becoming sane. Those who fall into sanity believe it's possible that they may be sane, but fear and doubt their own sanity.
Robin, I think you may be confusing a heuristic with a causal relationship. Or maybe this is something that looks different from the outside than from the inside.
Imagine I make a testable prediction, and also wager a large sum of money that my prediction is correct. The fact that I have made this large wager might increase your confidence that my prediction is correct, and certainly should increase your estimation of my own confidence in my prediction. But I cannot increase my own confidence that my prediction was correct by betting on it.
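(A hedged gloss in notation of my own, not the commenter's: to an observer, my bet is evidence about my private information, so their $P(\text{correct} \mid \text{bet})$ can exceed their prior $P(\text{correct})$; to me, the bet is a deterministic function of beliefs I already hold, so my $P(\text{correct} \mid \text{bet}) = P(\text{correct})$. Conditioning on a signal computed entirely from your own information leaves your posterior unchanged.)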
The real problem with the objection is that no sensible person would feel that the author's professed confidence that his ideas are correct is a good reason for the reader to believe that those ideas are correct in the first place. Before reading the book, it's perfectly reasonable to suspect that the author has more confidence in his ideas than the facts merit. Most authors do. But it would be absurd to argue in a review of the book, "this book probably sucks, since most books do".
Michael, I did not claim the non-existence of individual or collective rationality. I do think far too many irrational people have a subjective perception that they have chosen to be rational for that to be more than a rather weak guide to one's own rationality.
The history of science is a massive list of existence proofs of individual and collective rationality (collective even without cherry-picking), in some important, practically relevant sense, in impractical domains.
People have certain privileged knowledge about their own rationality that they don't have about that of other people or of suicide rock.
From a personal perspective, one sensible approach is to observe that people can largely be rational, in the practically relevant sense, when they choose to be, and then to reason: either I am choosing to be rational in the practically relevant sense or I am choosing not to be, and in either case I will accomplish my goals better by claiming that I am being rational. Taking care to be rational may be justified (because doing so is what choosing to be rational entails), but assuming that one is still irrational in the practically relevant sense after taking such care is not.
Aaron, I think he'd say those are exceptions to a more general rule.
Michael, substituting "tend to be" for "am" sounds valid and sound.
Abigail, if Bryan is less biased overall because he is aware of biases, he'd still expect to be more biased on low-material-cost topics.
Michael, I don't see what this post has to do with science history.
Robin: This topic needs a book, not a blog post. Repeated blog posts on it just end up saying the same thing over and over. The basic fact that your above assertion, in simple form, fails to explain is that science has progressed a great deal in impractical domains such as astronomy and evolutionary biology. Furthermore, in unusual cases humans do seem to behave in ways from which indicators of reliable truth-seeking, such as Aumann Agreement, emerge. Feynman reported one such case in the paper "Truly Great Men" here http://www.amazon.com/Manha... while I have witnessed others.
In many fields, experts seem to face direct material costs for being wrong only if they introduce fundamentally revolutionary ideas, rather than merely pushing the boundaries.
Bryan Caplan is not inconsistent, because he says that people "tend to be", not "are", more irrational. If we are alive to such risks of irrationality, we are less likely to fall for them; otherwise Overcoming Bias has no rational point. OB might then have value in making us feel good about ourselves, but not in increasing rationality. If OB has rational value, then people can increase their rationality by learning of irrationality traps, and then BC is justified in saying that his own direct evaluations are more reliable than the average.