What is the point of overcoming bias to get beliefs closer to truth? How natural is that? What benefits could outweigh its costs?
Some of us may want to, and that would be enough reason for us to gather in a place like this. Those who don't can go elsewhere.
What is completely natural and human is to claim to want to believe truth. Most every group likes to feel superior by believing that its beliefs are less biased than other groups' beliefs. Consider today's "reality-based politics" or frequent Christian references to TRUTH.
Wilkinson, you assert that rationality in the Bayesian sense is a culturally contingent construction. It would seem that you must therefore assert one or more of the following:
1) Truth is culturally contingent: there is no single reality; or beliefs and reality are incomparable.
2) Desire for truth is culturally contingent: rationality is a culturally contingent procedure because it rests on a culturally contingent desire.
3) Bayesian mathematics fails to attain the goal set for it by its own advocates: it is not a mathematically optimal albeit incomputable method for arriving as close to the truth as possible.
4) Computations can only make "mistakes" compared to other realistically performable computations: There is no sense in which, for example, assigning a probability P(A) < P(A&B) is a "mistake" just because it violates the Kolmogorov/Cox axioms for probability, unless we can exhibit an actual computation that says differently and does better. Furthermore, humans cannot be mistaken compared to Bayes-method computers, only compared to other procedures that a human could realistically perform.
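The axiom violation Eliezer mentions in 4) can be made concrete. Here is a minimal illustrative sketch of my own (the dice events are hypothetical, not from the thread): under the Kolmogorov axioms, P(A&B) <= P(A) for any events A and B, because every outcome satisfying both A and B also satisfies A.

```python
# Illustrative sketch (not from the original comment): check the
# conjunction rule P(A & B) <= P(A) by brute force on a small
# discrete sample space.
from itertools import product

# Sample space: two fair six-sided dice (a stand-in example).
outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event, given as a predicate over outcomes."""
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

A = lambda o: o[0] >= 4            # first die shows 4, 5, or 6
B = lambda o: o[0] + o[1] == 9     # the two dice sum to 9

p_A = prob(A)                                  # 18/36 = 0.5
p_AB = prob(lambda o: A(o) and B(o))           # (4,5),(5,4),(6,3): 3/36
assert p_AB <= p_A  # the conjunction rule holds for every choice of A, B
```

Assigning P(A) < P(A&B), as subjects do in conjunction-fallacy experiments, is a "mistake" in exactly this sense: no assignment of probabilities to outcomes can reproduce it.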
For the first time on this forum, I find myself agreeing almost entirely with a blogger's original post. Just thought I'd encourage Wilkinson and others who would follow his lead that he has struck a chord.
Wilkinson, I addressed many of these issues in Why Truth? and What's a Bias? If you find fault with my reasoning there, feel free to post it in a comment here. Incidentally, I aspire to dramatize and romanticize rationality because I believe Rationality is a dramatic and romantic endeavor, one of the great high melodies sung in the unfolding epic poem of Humankind.
I agree with Glen that truth is a subgoal of nearly any goal that requires cognition, up to and including walking across a room. When you successfully locate and sit down on a chair, you are committing an act of truthfinding no less than believing that humans evolved by natural selection. The politics and arguments and surrounding verbal bibblebabble are more complicated in the second case, but the math is the same.
Robin, we've differed on this before, but I still object to your calling people who deliberately endorse truth yet engage in self-deluding behavior "hypocrites". Traditionally, a "hypocrite" is someone who verbally advocates a morality which they do not privately believe. Many people claim to believe in a morality, and internally believe they believe in the morality, yet commit some acts not in accordance with it; these people are traditionally called "sinners". People who say they believe in truth (honestly, without knowing intent to deceive you) and then self-deceive are sinners, not hypocrites.
Will, I see your point, but allow me to argue devil's advocate. To justify Bayes' Rule, we needn't say you have to value truth for truth's sake. We need only recognize that knowing the truth (or rather, true estimates of probability) is instrumental to the best achievement of your other goals, whatever those goals might be. So in this sense, Bayes' Rule is not culturally contingent at all, because it does not dictate your goals.
Now, it might also be costly to apply Bayes' Rule, so you might reasonably choose to conserve cognitive effort by applying Bayes' Rule only some of the time. That would be consistent with a rational allocation of scarce cognitive resources. But the non-culturally-contingent value of Bayes' Rule is demonstrated by the fact that if you could have its output provided to you for free, advancement of your own (possibly culturally contingent) goals would dictate using that output.
Robin, Ah! Got it.
Would it be a kind of victory if people who now say they care about truth, but who really don't, started admitting that they really don't? Or, if more people stopped paying virtue the compliment of hypocrisy, would that actually damage the cultural prestige of truth? I'm not sure.
Will, "it" referred to overcoming bias. We would like more people to share our goals, but we have no assurance of how successful we can be. Probably our greatest lever is shame and hypocrisy, i.e., the fact that most people pretend to want what we (say we) want. If forced to choose between what they pretend to want and what they usually want, many may choose their pretensions.
Robin,
You say:
"Some of us may want to, and that would be enough reason for us to gather in a place like this. Those who don't can go elsewhere."
But you're interested in making it so that more people want to, right?
"Most every group likes to feel superior by believing that its beliefs are less biased than other groups' beliefs. Consider today's "reality-based politics" or frequent Christian references to TRUTH."
Absolutely. And part of the debiasing project is to create an increasingly large group who is able to feel superior, and who a great many people acknowledge are RIGHT to feel superior, for its commitment to cognitive norms that really are more reliably truth-tracking than the more biased alternatives. No?
"What benefits could outweigh its costs?"
I'm not sure what you intend as the reference of 'its' in this sentence. Could you clarify?
Eliezer,
2) is the closest to my own position, although I don't think it is a very helpful way of putting it. I don't in fact think that the desire for truth is culturally contingent, since certain kinds of true beliefs are necessary for everyone at all times. It would be clearer to put it in the negative: the desire to weed out ALL or ALMOST ALL FALSEHOOD is culturally contingent. When you put it that way, I think it's obviously true.
I also meant "culturally contingent" in the plain sense that Bayes' Rule was discovered in a particular culture at a particular time in history. Additionally, I meant that I do not believe that Bayes' Rule, or other principles of good thinking we learn in school, is implicit in the very structure of human cognition in the way that, say, Kant thought that certain principles of theoretical and practical reason were. That is to say, according to Kant, if you are in the business of thinking or acting at all, you are already committed to certain normative principles that you MUST follow lest you somehow imperil your humanity. Bayes' Rule isn't like that. It is more human than not to violate it. To keep up the Kantian language, its normative bindingness comes from its part in certain hypothetical imperatives: if you want to identify the probability of the truth of a proposition as accurately as possible, use Bayes' Rule. A large part of what I'm going on about is that accurately identifying the probability of the truth of propositions is an aim that individuals and cultures can prize more or less, and that if you want individuals and cultures to prize it more, you have to provide reasons they already care about as to why they ought to bear the cost of pursuing it. You may also need to dramatize and romanticize the enterprise.
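That hypothetical imperative can be rendered concretely. The following is a minimal sketch of my own (the sensitivity, specificity, and prior figures are made up for illustration): Bayes' Rule tells you how to revise P(H) after observing evidence E, whatever your ultimate goals happen to be.

```python
# Minimal illustration (numbers are hypothetical): Bayes' Rule,
# P(H | E) = P(E | H) P(H) / P(E), with P(E) expanded by the law
# of total probability over H and not-H.

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return the posterior P(H | E) from the prior and likelihoods."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Example: evidence seen 90% of the time when H is true and 20% of
# the time when it is false, applied to a hypothesis with a 10% prior.
posterior = bayes_update(prior=0.1, p_e_given_h=0.9, p_e_given_not_h=0.2)
```

With these numbers the posterior works out to 0.09 / 0.27, i.e. one third: the evidence raises a 10% belief to about 33%, whether or not the believer prizes truth for its own sake.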