Bayesian probability is a great model of rationality that gets lots of important things right, but there are two ways in which its simple version, the one that comes most easily to mind, is extremely misleading.
Matt, epistemic rationality may be a subset of instrumental rationality, or it may be different, but naively confusing the two does nobody any favors.
Bill, do you think that the current systems for deciding who should be heard on what are optimally managed?
This assumes that you care about being heard and making a difference. Personally, these things make no difference to me; I just want to understand the world as much as I can. If it turns out that I come across a way of understanding it that is new and other people take notice of it, then great!
I'm sympathetic to Bill's point of view up above.
The missing link from my previous post:
http://justoneminute.typepa...
Justoneminute had a good post on the first part of the subject--the small part of ourselves that we control. It's about a metaphor likening us to a boy riding an elephant (except that we may not even know we are riding the elephant or that there is an elephant):
I had to read this post twice to understand what Robin meant :D
But, Robin, how does one influence this organism? History does show that the system can be influenced (women's rights, green energy, equal rights for non-whites, human rights, etc.). Is it just a matter of luck, or of diligent ant work so that when "the time is right" the opinion tilts?
(If you have written about this earlier, I am sorry to ask you to repeat yourself!)
Lacking the information contained in the rest of your mind, you are vulnerable to bias. Generalizing to the rest of the world, lacking the information contained in all other minds, you are vulnerable to anthropic selection effects. Bias is just a special case of anthropics!
This was graphically illustrated in the Derren Brown episode on 'The System', where a member of the public was astonished to receive repeated correct predictions for horse racing, with the winning horses sometimes coming from impossible positions and getting astonishingly good luck in the running. She was tricked by confirmation bias because she only saw her own perspective. 'Panning out' to the wider 'people-scape' revealed the secret, which was anthropics: Derren was simultaneously sending tips to over 7,000 people, covering all possible combinations and weeding out the losers. Only the film from the one remaining person who by chance received all the right tips was used.
That which controls the wider 'system' in which Bayes is embedded can warp the probabilities by manipulating what other people see; this is the 'killer' flaw in Bayes.
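A minimal sketch of the filtering scheme described above (the recipient count and race sizes here are illustrative assumptions, not figures from the show): send a different tip sequence to each recipient so that every combination of winners is covered, and whoever survives every race sees a run of perfect predictions that looks impossible from the inside.

```python
import random

# Illustrative sketch of the survivorship scheme (sizes are assumptions).
HORSES_PER_RACE = 6
NUM_RACES = 5
recipients = list(range(HORSES_PER_RACE ** NUM_RACES))  # 7776 people, one per tip sequence


def tips_for(recipient_id):
    """Assign each recipient a unique tip sequence (base-6 digits of their id),
    so that collectively the recipients cover every possible outcome."""
    seq = []
    for _ in range(NUM_RACES):
        seq.append(recipient_id % HORSES_PER_RACE)
        recipient_id //= HORSES_PER_RACE
    return seq


# Random race outcomes.
winners = [random.randrange(HORSES_PER_RACE) for _ in range(NUM_RACES)]

# After the races, drop everyone whose tips were ever wrong; by construction
# exactly one recipient is left holding a "perfect" prediction record.
survivors = [r for r in recipients if tips_for(r) == winners]
print(len(survivors))  # -> 1: that one person experienced five correct tips in a row
```

From that single survivor's perspective the tipster looks infallible; only the view across the whole pool of recipients reveals the selection effect.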
The cognitive biases you see in individuals will not likely go away completely when you increase the scale. Aggregation will cancel out many biases, but it cannot correct errors when the biases correlate across agents, and when such errors do occur they are likely to be severe in magnitude. But I agree that a systematic study of cognitive biases in a group of agents who themselves have biases is important.
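To make the correlation point concrete, here is a minimal simulation (the error sizes and the shared-bias model are assumptions for illustration only): averaging many agents' estimates washes out their independent errors, but a bias component they all share passes straight through the average no matter how many agents are added.

```python
import random

# Minimal illustration (all numbers are assumptions): averaging across agents
# removes independent errors but not a bias component the agents share.
def average_estimate(true_value, n_agents, shared_bias, noise_sd):
    estimates = [true_value + shared_bias + random.gauss(0, noise_sd)
                 for _ in range(n_agents)]
    return sum(estimates) / n_agents

random.seed(0)
true_value = 100.0

# Independent errors only: the group average converges on the truth.
print(average_estimate(true_value, 10_000, shared_bias=0.0, noise_sd=5.0))  # ~100

# Add a bias every agent shares: no amount of averaging removes it.
print(average_estimate(true_value, 10_000, shared_bias=3.0, noise_sd=5.0))  # ~103
```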
Robin made a similar argument a while back at LessWrong: Rational Me or We?
I'm also reminded of a point often made at Boettke's blog (formerly known as The Austrian Economists) about how Austrians should seek to get work published in high-impact non-Austrian journals so as to better spread Austrian ideas.
Agent00yak, the same joke was already made by n/a at r/h/e notes. In fact, I suspect you stole the pun from him.
I understand why you would think that.
It seems to me that either self-verifying topics are more common than that, or our "who to listen to" systems work better than you think. Take this for example. How did a mailing list posting by an uncredentialed and unaffiliated person (I was an undergrad at the time), describing a cryptographic protocol (not exactly pure math), gather eighty-some citations and get described as "influential" by review papers?
I haven't had too much trouble finding an audience for my later works (on largely unrelated topics, like my C++ crypto library and decision theory) either, despite not having gotten any impressive credentials or affiliations since graduating from college. Was I just lucky that my efforts weren't wasted?
Rational is defined by The RMCM as that which arises directly out of Cause, Effect, Function, and Form©. The RMCM is indifferent to the word "systems," as it has become meaningless in the search for greater truth. This Exercise should at least give us some common encapsulations.
The point is that it's not just random noise.
I'm a little troubled by your comment:
"Together we must manage systems for deciding who should be heard on what. Given such systems, each of us will make our strongest contributions, by far, by fitting into these systems."
Chinese leaders might be interested in managing systems for deciding who should be heard on what, though.
Think of the system in terms of metabolism. When you do, you will come to realize that the current "system" is practicing cannibalism and is therefore not rational. Rational: that which adheres to the rules that iteratively bind Form, Function, Cause and Effect into the Universe.