"I don't want to look to deeply at that" might well be a slippery slope to dishonesty when you do it because you fear the implications of "that" would be unpleasant. But there are other reasons to say "I don't want to look to deeply at that" that seem to me to be at worst neutral with respect to misleading oneself. Consider, for example, David Friedman's "When I am picking problems to work on, ones that stumped John von Neumann go at the bottom of the stack."
but I think it is often the case that the "I don't want to look to deeply at that" strategy is equivalent to self-deception.
It is, very much so. I wouldn't advocate it. But it doesn't have the cost that believing a lie does. If you can't face an inconvenient truth (and there may be valid reasons for that, especially if you have other pressures at the time) it's much better to turn away than to lie.
Recently I was looking for a partner in a little business application, and the guy noted up front: "I'm the most honest guy you will ever meet". I inferred he was probably below average in honesty.
"There's only one way to tell if a man is honest - ask him. If he says yes, then you know he's crooked." - Groucho Marx
I developed an irrational commitment to "honesty" after playing the Ultima computer games as a child. As a result, I spend a ridiculous amount of energy trying to figure out how to deceive people by saying true statements when telling a lie would be a lot simpler. ;)
Mike, as too your final paragrap, it's not necessarily a binary option. It might be fruitful to study the levels of battling advocates vs. discussing analysts that are optimal for various goal achievements for human social groups, be they scientific discovery or policy decision making.
I think Mr. Yudkowsky was referring to my first post when addressing this question of why we should be honest. I read the post he linked to and found it very thoughtful. I tend to agree with an idea suggested in another post, which is that it seems tough for us to consciously self-deceive. “I will lie to myself and I will believe the lie.” I think we perhaps do have means to help ourselves believe unlikely or plainly false things by not investing much energy in truth-seeking, or in memory retention (Perhaps by not thinking about some upsetting thing, the memory atrophies and one starts to not be able to ‘remember’ what is true, or doesn’t have ready access to the memory).
That said, I think that there are reasonable arguments in favor of allowing certain possible falsehoods to stand. For example, imagine a thought occurs to you that there is a certain reasonable possibility that your ‘sister’ is actually your ‘mother’, and your ‘parents’ are actually your ‘grandparents.’ You might decide not to investigate, since you believe (perhaps not unreasonably) that if the truth was your ‘sister’ was actually your mother, you’d suffer serious mental upset. You compare this possible mental upset with the discomfort of the alternative, not being certain of the truth of who your parents are, and decide that you’d prefer uncomfortable uncertainty to a chance of painful certainty or relief (this seems in keeping with a study of people’s preferences, if I remember right).
Also, wishful thinking might be good for you to a degree. I recall some study suggesting depressed people tend to have more realistic views of the world than the non-depressed. Also, I recall reading the Mark Twain quotation, “With arrogance and ignorance, success is assured.” To some extent, would we embark on difficult matters that were worthwhile in retrospect, if we had known how much we would suffer, and how much we were mistaken when taking on an endeavor? Perhaps it’s better to be a bit arrogant and ignorant—to an extent. My thought is, when should we overcome our biases and when should we leave them alone?
One final thought that I’ve contemplated—maybe it’s more economical in some cases to allow ourselves to be biased, if others will correct our biases for us. Would in some cases it make sense to be more lawyer-like in our advocacy of an idea, because others will be sufficiently lawyer-like in opposition to the idea, and debate would perhaps be more vigorous than if we were all calm-headed scientists?
Stuart, you make a good point, but I think it is often the case that the "I don't want to look to deeply at that" strategy is equivalent to self-deception.
I doubt many people would ever think "I will willingly deceive myself on this." They would most likely react like Robin did above - "I don't want to look too deeply at that." That sort of dishonesty doesn't have to propagate, if its left unchallenged.
That is, if I choose to be more honest about relationships, might I end up being less honest about other things? Or will expressing courage in one area give me more courage in other areas?
I see personal overcoming bias as an act of monumental arrogance - the belief that by overcoming our own biases, we will become more able to overcome the biases in the world, and help improve that world. We don't know it'll work, but we can only be sure of that once enough people have tried. So we're the lab rats that will hopefully answer that question, Robin.
I don't know if honesty makes the honesty muscles tougher, but I do think that it would be a catastrophic step if I were to say, for the first time, "I will willingly deceive myself on this." Dishonesty propagates. Once you lie about something, you have to lie about every truth entangled with the original truth, lie about how you chose your arguments, lie about what kinds of arguments are valid, lie about whether you lied. Dishonesty to yourself double propagates for triple, because you cannot even choose to remain silent.
Honesty may or may not become any easier, but willing self-deception certainly would get easier over time.
The muscle metaphor that Arnold Kling uses (and I think this metaphor is also used in the paper cited by Tyler Cowen) seems to fit best in my opinion. If I lift weights this morning, I'll probably reduce my ability to lift weights tonight. Tomorrow my muscles might have rested and been repaired and the muscles are ready to lift around the same weight I did today. If I lift weights regularly, over time I can increase the amount of weight I can reliably lift. It takes time for my muscles to be built up, but also it will take some time for them to atrophy. Willpower seems like this, a bit.
Another possibility is that over time we become more efficient in using our willpower, through learning what works and what doesn't. This might cause us to feel we're gaining willpower, when in reality we're just getting more bang for our buck.
With respect to the movie: is it plausible instead that there is some cognitive bias that prevents you from admitting that you wanted to look/think away because you were in fact bored? After all, the characters are supposed to be "genuine" and "engaging" so there must be some other reason ...
Recently I was looking for a partner in a little business application, and the guy noted up front: "I'm the most honest guy you will ever meet". I inferred he was probably below average in honesty.
1. Why are we honest? It seems to me honesty is a means to an end, and if the end isn't particularly clear to us, honesty may feel pointless. "Why am I bothering to understand this reality if it bothers me so much. Is it going to serve me well?" We have limited memory and physical energy--why spend it focusing on knowing truths that don't serve any positive end?
2. I found the paper on motivation that Tyler Cowen referred to very compelling (I admit I skimmed it rather than read it). It makes sense that when you're trying to quit smoking that you shouldn't try to also start a crash diet and a new exercise regime.
Who would then have the most willpower? Someone who exercises it frequently and has allowed it to replenish? Perhaps someone who frequently works hard and has just returned from a vacation? Is this the time to do your hardest tasks?
I recall an account of one of my favorite people, Ulysses S. Grant, known for his willpower, that suggested when he was in between battles, he was rather indolent, which might suggest he grew his will power through practice, and perhaps intuitively replenished it for difficult tasks by remaining slack, so to speak, prior to starting the difficult task. He said he had a practice: once he'd resolved to do something, he wouldn't stop until it was done, which might have helped him build his willpower muscle.
"I don't want to look to deeply at that" might well be a slippery slope to dishonesty when you do it because you fear the implications of "that" would be unpleasant. But there are other reasons to say "I don't want to look to deeply at that" that seem to me to be at worst neutral with respect to misleading oneself. Consider, for example, David Friedman's "When I am picking problems to work on, ones that stumped John von Neumann go at the bottom of the stack."
but I think it is often the case that the "I don't want to look to deeply at that" strategy is equivalent to self-deception.
It is, very much so. I wouldn't advocate it. But it doesn't have the cost that believing a lie does. If you can't face an inconvenient truth (and there may be valid reasons for that, especially if you have other pressures at the time) it's much better to turn away than to lie.
Recently I was looking for a partner in a little business application, and the guy noted up front: "I'm the most honest guy you will ever meet". I inferred he was probably below average in honesty.
"There's only one way to tell if a man is honest - ask him. If he says yes, then you know he's crooked." - Groucho Marx
I developed an irrational commitment to "honesty" after playing the Ultima computer games as a child. As a result, I spend a ridiculous amount of energy trying to figure out how to deceive people using only true statements, when telling a lie would be a lot simpler. ;)
Mike, as to your final paragraph, it's not necessarily a binary option. It might be fruitful to study what mix of battling advocates versus discussing analysts is optimal for the various goals of human social groups, be they scientific discovery or policy decision making.
I think Mr. Yudkowsky was referring to my first post when addressing this question of why we should be honest. I read the post he linked to and found it very thoughtful. I tend to agree with an idea suggested in another post: it seems tough for us to consciously self-deceive, to say "I will lie to myself and I will believe the lie." I think we do perhaps have means to help ourselves believe unlikely or plainly false things by not investing much energy in truth-seeking or in memory retention (perhaps by not thinking about some upsetting thing, the memory atrophies and one loses ready access to what is true, or can no longer quite 'remember' it).
That said, I think there are reasonable arguments in favor of allowing certain possible falsehoods to stand. For example, imagine the thought occurs to you that there is a reasonable possibility that your 'sister' is actually your mother, and your 'parents' are actually your grandparents. You might decide not to investigate, since you believe (perhaps not unreasonably) that if the truth were that your 'sister' was actually your mother, you'd suffer serious mental upset. You compare this possible upset with the discomfort of the alternative, not being certain who your parents really are, and decide you'd prefer uncomfortable uncertainty to a chance of painful certainty or relief (this seems in keeping with a study of people's preferences, if I remember right).
Also, wishful thinking might be good for you, to a degree. I recall some study suggesting depressed people tend to have more realistic views of the world than the non-depressed. I also recall reading the Mark Twain quotation, "With arrogance and ignorance, success is assured." Would we embark on difficult endeavors that proved worthwhile in retrospect if we had known at the outset how much we would suffer and how mistaken we were in taking them on? Perhaps it's better to be a bit arrogant and ignorant, to an extent. My thought is: when should we overcome our biases, and when should we leave them alone?
One final thought I've contemplated: maybe it's more economical in some cases to allow ourselves to be biased, if others will correct our biases for us. Would it make sense in some cases to be more lawyer-like in our advocacy of an idea, because others will be sufficiently lawyer-like in opposition to it, and the debate would be more vigorous than if we were all calm-headed scientists?
Stuart, you make a good point, but I think it is often the case that the "I don't want to look too deeply at that" strategy is equivalent to self-deception.
I doubt many people would ever think "I will willingly deceive myself on this." They would most likely react like Robin did above - "I don't want to look too deeply at that." That sort of dishonesty doesn't have to propagate, if it's left unchallenged.
That is, if I choose to be more honest about relationships, might I end up being less honest about other things? Or will expressing courage in one area give me more courage in other areas?
I see personally overcoming bias as an act of monumental arrogance - the belief that by overcoming our own biases, we will become more able to overcome the biases in the world, and help improve that world. We don't know that it will work, and we can only find out once enough people have tried. So we're the lab rats that will hopefully answer that question, Robin.
Since someone asked "Why should we be honest [with ourselves]?" I'll go ahead and link yet again to http://www.overcomingbias.com/2006/11/why_truth_and.html.
I don't know if honesty makes the honesty muscles tougher, but I do think it would be a catastrophic step if I were to say, for the first time, "I will willingly deceive myself on this." Dishonesty propagates. Once you lie about something, you have to lie about every truth entangled with the original truth, lie about how you chose your arguments, lie about what kinds of arguments are valid, lie about whether you lied. Dishonesty to yourself propagates doubly, if not triply, because you cannot even choose to remain silent.
Honesty may or may not become any easier, but willing self-deception certainly would get easier over time.
The muscle metaphor that Arnold Kling uses (and I think this metaphor is also used in the paper cited by Tyler Cowen) seems to fit best, in my opinion. If I lift weights this morning, I'll probably reduce my ability to lift weights tonight. By tomorrow my muscles will have rested and been repaired, and I'll be ready to lift around the same weight I did today. If I lift weights regularly, over time I can increase the amount of weight I can reliably lift. It takes time for my muscles to build up, but it also takes time for them to atrophy. Willpower seems like this, a bit.
Another possibility is that over time we become more efficient in using our willpower, through learning what works and what doesn't. This might cause us to feel we're gaining willpower, when in reality we're just getting more bang for our buck.
Tom, I've been bored many times before with different effects, so this cannot be just boredom.
Mike, sounds like you are saying both Tylers are right in different cases.
With respect to the movie: is it plausible instead that there is some cognitive bias that prevents you from admitting that you wanted to look/think away because you were in fact bored? After all, the characters are supposed to be "genuine" and "engaging" so there must be some other reason ...
A few thoughts:
1. Why are we honest? It seems to me honesty is a means to an end, and if the end isn't particularly clear to us, honesty may feel pointless. "Why am I bothering to understand this reality if it bothers me so much? Is it going to serve me well?" We have limited memory and physical energy--why spend them focusing on knowing truths that don't serve any positive end?
2. I found the paper on motivation that Tyler Cowen referred to very compelling (I admit I skimmed it rather than read it). It makes sense that when you're trying to quit smoking, you shouldn't also try to start a crash diet and a new exercise regime.
Who would then have the most willpower? Someone who exercises it frequently and has allowed it to replenish? Perhaps someone who frequently works hard and has just returned from a vacation? Is this the time to do your hardest tasks?
I recall an account of one of my favorite people, Ulysses S. Grant, known for his willpower, suggesting that between battles he was rather indolent. This might suggest he grew his willpower through practice, and perhaps intuitively replenished it by remaining slack, so to speak, before starting a difficult task. He said he had a practice: once he'd resolved to do something, he wouldn't stop until it was done, which might have helped him build his willpower muscle.