Ignorance About Intuitions
In common usage, intuitions lead us to believe things without being able to articulate evidence or reasons for those beliefs. — Wikipedia
I'm not offering you a phony seventeen-step "proof that murder is normally wrong." Instead, I begin with concrete, specific cases where morality is obvious, and reason from there. — Bryan Caplan
For each of our beliefs, we can ask our minds for "reasons" for that belief. Our minds usually then offer reasons, though we usually don't know how much those reasons have to do with the actual causes of the belief. We can often test those reasons through criticism, increasing our confidence in a belief when criticism is less effective than expected, and decreasing it when criticism is more effective than expected.
For some of our beliefs, our minds don't offer much in the way of reasons. We say these beliefs are more "intuitive." In a hostile debating context this response can seem suspicious; you might expect one side in a debate to refuse to offer reasons just when they had already tested those reasons against criticism, and found them wanting. That is, we might expect a debater to pretend he didn't have any reasons when he knew his reasons were bad.
But this doesn't obviously support much distrust of our own intuitive beliefs. Our internal mind is not obviously like a hostile debating context, and we must admit that our minds are built so that the vast majority of our thinking is unconscious. It is unreasonable to expect our minds to be able to tell us much in the way of reasons for most of our beliefs.
Furthermore, we must admit that even when we do have reasons for our beliefs, not only do our chains of reasoning usually end at pretty intuitive beliefs, but we usually can't even fully articulate why each step in a reasoning chain supports the next one. We clearly have little choice but to rely greatly on intuition.
But there still remains the question of how much to rely on intuition, when we do have a choice. Sometimes we seem to have a choice between just accepting belief confidence levels suggested by opaque subconscious processes, or employing more explicit reasoning processes, even if those more explicit processes still rely heavily on intuitive beliefs.
We find ourselves managing complex networks of beliefs. Bryan's picture seems to be of a long metal chain linked at only one end to a solid foundation; chains of reasoning mainly introduce errors, so we do best to find and hold close to our few most confident intuitions. My picture is more like Quine's "fabric," a large hammock made of string tied to hundreds of leaves of invisible trees; we can't trust each leaf much, but even so we can stay aloft by connecting each piece of string to many others and continually checking for and repairing broken strings.
But having identified differing pictures, what then? Perhaps Bryan is satisfied to just have an intuition preferring his long chain picture, but I want to find more explicit reasons to choose among pictures. I have to admit, however, that I don't yet have much I can point to. I'm not even sure we have much in the way of empirical data on the belief accuracy of folks who rely more versus less on intuition.
We remain sadly ignorant about intuition, an important neglected research area.
Added: We clearly vary, across both topics and people, in how much we rely on intuition. So the more precise question is whether we can identify any biases in these judgments — that is, when do we rely too much, or too little, on intuition?