    When you surround the enemy
    Always allow them an escape route.
    They must see that there is
    An alternative to death.

    —Sun Tzu, The Art of War

    Don’t raise the pressure, lower the wall.

    —Lois McMaster Bujold, Komarr

    I recently happened into a conversation with a nonrationalist who had somehow wandered into a local rationalists’ gathering. She had just declared (a) her belief in souls and (b) that she didn’t believe in cryonics because she believed the soul wouldn’t stay with the frozen body. I asked, “But how do you know that?”

    From the confusion that flashed on her face, it was pretty clear that this question had never occurred to her. I don’t say this in a bad way—she seemed like a nice person without any applied rationality training, just like most of the rest of the human species.

    Most of the ensuing conversation was on items already covered on Overcoming Bias—if you’re really curious about something, you probably can figure out a good way to test it, try to attain accurate beliefs first and then let your emotions flow from that, that sort of thing. But the conversation reminded me of one notion I haven’t covered here yet:

    “Make sure,” I suggested to her, “that you visualize what the world would be like if there are no souls, and what you would do about that. Don’t think about all the reasons that it can’t be that way; just accept it as a premise and then visualize the consequences. So that you’ll think, ‘Well, if there are no souls, I can just sign up for cryonics,’ or ‘If there is no God, I can just go on being moral anyway,’ rather than it being too horrifying to face. As a matter of self-respect, you should try to believe the truth no matter how uncomfortable it is, like I said before; but as a matter of human nature, it helps to make a belief less uncomfortable, before you try to evaluate the evidence for it.”

    The principle behind the technique is simple: as Sun Tzu advises you to do with your enemies, you must do with yourself—leave yourself a line of retreat, so that you will have less trouble retreating. The prospect of losing your job, for example, may seem a lot more scary when you can’t even bear to think about it than after you have calculated exactly how long your savings will last, and checked the job market in your area, and otherwise planned out exactly what to do next. Only then will you be ready to fairly assess the probability of keeping your job in the planned layoffs next month. Be a true coward, and plan out your retreat in detail—visualize every step—preferably before you first come to the battlefield.

    The hope is that it takes less courage to visualize an uncomfortable state of affairs as a thought experiment, than to consider how likely it is to be true. But then after you do the former, it becomes easier to do the latter.

    Remember that Bayesianism is precise—even if a scary proposition really should seem unlikely, it’s still important to count up all the evidence, for and against, exactly fairly, to arrive at the rational quantitative probability. Visualizing a scary belief does not mean admitting that you think, deep down, it’s probably true. You can visualize a scary belief on general principles of good mental housekeeping. “The thought you cannot think controls you more than thoughts you speak aloud”—this happens even if the unthinkable thought is false!
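    The "count up all the evidence, for and against, exactly fairly" step has a simple quantitative form: written in log-odds, Bayes' rule says each independent piece of evidence adds its log-likelihood ratio to your prior log-odds, whether the shift is comfortable or not. A minimal sketch (the numbers are made up for illustration):

    ```python
    from math import log10

    def update_log_odds(prior_log_odds, likelihood_ratios):
        """Bayes' rule in log-odds form: each independent piece of
        evidence shifts belief by its log-likelihood ratio."""
        return prior_log_odds + sum(log10(lr) for lr in likelihood_ratios)

    # Hypothetical numbers: prior odds of 1:99 for a scary proposition,
    # then three independent observations -- two favoring it (likelihood
    # ratios 4 and 2) and one against it (likelihood ratio 0.5).
    prior = log10(1 / 99)
    posterior = update_log_odds(prior, [4, 2, 0.5])
    print(round(10 ** posterior, 4))  # posterior odds: prints 0.0404
    ```

    The evenhandedness the post asks for is just the requirement that the 0.5 gets added in on exactly the same terms as the 4 and the 2.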

    The leave-a-line-of-retreat technique does require a certain minimum of self-honesty to use correctly.

    For a start: You must at least be able to admit to yourself which ideas scare you, and which ideas you are attached to. But this is a substantially less difficult test than fairly counting the evidence for an idea that scares you. Does it help if I say that I have occasion to use this technique myself? A rationalist does not reject all emotion, after all. There are ideas which scare me, yet I still believe to be false. There are ideas to which I know I am attached, yet I still believe to be true. But I still plan my retreats, not because I’m planning to retreat, but because planning my retreat in advance helps me think about the problem without attachment.

    But the greater test of self-honesty is to really accept the uncomfortable proposition as a premise, and figure out how you would really deal with it. When we’re faced with an uncomfortable idea, our first impulse is naturally to think of all the reasons why it can’t possibly be so. And so you will encounter a certain amount of psychological resistance in yourself, if you try to visualize exactly how the world would be, and what you would do about it, if My-Most-Precious-Belief were false, or My-Most-Feared-Belief were true.

    Think of all the people who say that without God, morality is impossible.[1] If theists could visualize their real reaction to believing as a fact that God did not exist, they could realize that, no, they wouldn’t go around slaughtering babies. They could realize that atheists are reacting to the nonexistence of God in pretty much the way they themselves would, if they came to believe that. I say this to show that it is a considerable challenge to visualize the way you really would react to believing the opposite of a tightly held belief.

    Plus it’s always counterintuitive to realize that, yes, people do get over things. Newly minted quadriplegics are not as sad, six months later, as they expect to be, etc. It can be equally counterintuitive to realize that if the scary belief turned out to be true, you would come to terms with it somehow. Quadriplegics deal, and so would you.

    See also the Litany of Gendlin and the Litany of Tarski.  What is true is already so; owning up to it doesn't make it worse.  You shouldn't be afraid to just visualize a world you fear. If that world is already actual, visualizing it won't make it worse; and if it is not actual, visualizing it will do no harm.  And remember, as you visualize, that if the scary things you're imagining really are true—which they may not be!—then you would, indeed, want to believe it, and you should visualize that too; not believing wouldn't help you.

    How many religious people would retain their belief in God if they could accurately visualize that hypothetical world in which there is no God and they themselves have become atheists?

    Leaving a line of retreat is a powerful technique, but it’s not easy. Honest visualization doesn’t take as much effort as admitting outright that God doesn’t exist, but it does take an effort.

    [1] And yes, this topic did come up in the conversation; I’m not offering a strawman.

    75 comments

    How many rationalists would retain their belief in reason, if they could accurately visualize that hypothetical world in which there was no rationality and they themselves have become irrational?

    1 · [anonymous] · 11y
    I don't know. But I would. Irrationality is caused by ignorance, so there will always be tangent worlds (while regarding this current one as prime) in which I give up. There will always be a world where anything that is physically possible occurs. (And probably many where even that requirement doesn't hold.) To put it another way, there has been a moment in time when I was not rational. Is that reason to give up rationality forever? Time could be just another dimension, its manipulation as far out of our grasp as that of other possible worlds.

    if they could accurately visualize that hypothetical world in which there was no rationality and they themselves have become irrational?

    I just attempted to visualize such a world, and my mind ran into a brick wall. I can easily imagine a world in which I am not perfectly rational (and in fact am barely rational at all), and that world looks a lot like this world. But I can't imagine a world in which rationality doesn't exist, except as a world in which no decision-making entities exist. Because in any world in which there exist better and worse options and an entity that can model those options and choose between them with better than random chance, there exists a certain amount of rationality.

    5 · Ben Pace · 11y
    I suppose I'd just think back to before I met LessWrong. I wouldn't choose that world.
    9 · Voltairina · 9y
    Well, a world that lacked rationality might be one in which all the events were a sequence of non sequiturs. A car drives down the street. Then disappears. We are in a movie theater with a tyrannosaurus. Now we are a snail on the moon. Then there's just this poster of rocks. Then I can't remember what sight was like, but there's jazz music. Now I fondly remember fighting in World War 2, while evading the Empire with Han Solo. Oh! I think I might be boiling water, but with a sense of smell somehow... that's a poor job of describing it -- too much familiar stuff -- but you get the idea. If there was no connection between one state of affairs and the next, talking about what strategy to take might be impossible, or a brief possibility that then disappears when you forget what you are doing and you're back in the movie theater again with the tyrannosaurus. If 'you' is even a meaningful way to describe a brief moment of awareness bubbling into being in that universe. Then again, if at any moment 'you' happen to exist and 'you' happen to understand what rationality means -- I guess now that I think about it, if there is any situation where you can understand what the word rationality means, it's probably one in which it exists (however briefly) and is potentially helpful to you; even if there is little useful to do about whatever situation you are in, there might be some useful thing to do about the troubling thoughts in your mind.
    5 · CCC · 9y
    While that is a world without rationality, it seems a fairly extreme case. Another example of a world without rationality is a world in which, the more you work towards achieving a goal, the longer it takes to reach that goal; so an elderly man might wander distractedly up Mount Everest to look for his false teeth with no trouble, but a team of experienced mountaineers won't be able to climb a small hill. Even if they try to follow the old man looking for his teeth, the universe notices their intent and conspires against them. And anyone who notices this tendency and tries to take advantage of it gets struck by lightning (even if they're in a submarine at the time) and killed instantly.
    6 · Voltairina · 9y
    That reminds me of Hofstadter's Law: "It always takes longer than you expect, even when you take into account Hofstadter's Law."
    2 · JustinMElms · 8y
    I like both Voltairina's and your takes on the non-rational world. I was having a lot of trouble working something out. That said, while Voltairina's world is a bit more horrifyingly extreme than yours, it seems to me more probable that cause and effect simply did not exist. I can envision a structure of elementary physics that simply changes -- functionally randomly -- far more easily than one in which causality does exist, but operates in the inverse. I have more trouble envisioning the elementary physics that bring that into existence without an observational intellect directly upsetting motivated plans. All that is to say, might not your case be the more extreme one?
    1 · CCC · 8y
    ...it's possible. There are many differences between our proposed worlds, and it really depends on what you mean by "more extreme". Voltairina's world is "more extreme" in the sense that there are no rules, no patterns to take advantage of. My world is "more extreme" in that the rules actively punish rationality. My world requires that elementary physics somehow takes account of intent, and then actively subverts it. This means that it reacts in some way to something as nebulous as intent. This implies some level of understanding of the concept of intent. This, in turn, implies (as you state) an observational intellect - and worse, a directly malevolent one. Voltairina's can exist without a directly malevolent intelligence directing things. So it really comes down to what you mean by "extreme", I guess. Both proposed worlds are extreme cases, in their own way.
    0 · JustinMElms · 8y
    Fair point.
    4 · Jotto999 · 11y
    I'm not sure what "no rationality" would mean. Evolutionarily relevant kinds of rationality can still be expected, like preference for sexually fertile mates, fearing spiders/snakes/heights, and if we're still talking about something at all similar to Homo sapiens, language and cultural learning and such, which require some amount of rationality to use. I wonder if you might be imagining rationality in the form of essentialism, allowing you to universally turn the attribute off, but in reality there is no such off switch that is compatible with having decision-making agents.
    9 · Yosarian2 · 10y
    That's not the idea that really scares Less Wrong people. Here's a more disturbing one: try to picture a world where all the rational skills you're learning on Less Wrong are actually somehow flawed, and actually make it less likely that you'll discover the truth, or make you correct less often, for whatever reason. What would that look like? Would you be able to tell the difference? I must say, I have trouble picturing that, but I can't prove it's not true (we are basically tinkering with the way our minds work without a software manual, after all).
    1 · accolade · 8y
    related: http://lesswrong.com/lw/9p/extreme_rationality_its_not_that_great/
    1 · Idan Arye · 4y
    No rationality, or no Bayesianism? Rationality is a general term for reasoning about reality. Bayesianism is the specific school of rationality advocated on LessWrong. A "world in which there was no rationality" is not even meaningful, just like a "world in which there was no physics" is meaningless. Even if energy and matter behave in a way that's completely alien to us, there are still laws that govern how they work, and you can call these laws "physics". Similarly, even if we lived in some hypothetical world where the rules of reasoning are not derived from Bayes' theorem, there are still rules that can be thought of as that reality's rationality. A world without Bayesianism is easy to visualize, because we have all seen such worlds in fiction. Cartoons take this to the extreme - Wile E. Coyote paints a tunnel and expects Road Runner to crash into it - but Road Runner manages to go through. Then he expects that if Road Runner could go through, he could go through as well - but he crashes into it when he tries. Coyote's problem is that his rationality could have worked in our world - but he is not living in our world. He is living in a cartoon world with cartoon logic, and needs a different kind of rationality. Like... the one Bugs Bunny uses. Bugs Bunny plugs Elmer Fudd's rifle with his finger. In our world, this could not stop the bullet. But Bugs Bunny is not living in our world - he lives in cartoon world. He correctly predicts that the rifle will explode without harming him, and his belief in that prediction is strong enough to bet his life on it. Now, one may claim that it is not rationality that gets messed up here - merely physics. But in the examples I picked it is not just the laws of nature that don't work like real-world dwellers would expect - it is consistency itself that fails. Let us compare with superhero comics, where the limitations of physics are but a suggestion but at least some effort is made to maintain consistency. When mirror maste

    I enjoy the non-mathy posts. I believe Overcoming Bias is a worthy endeavor, and as a relatively new field of study, the math-oriented posts are important. They are often the most succinct and accurate way to convey concepts. With that said, I find the math posts to be dense with information, perhaps overly so. I find myself unconsciously starting to skim instead of read, and I find it difficult to force myself to pay attention.

    The mathy posts appeal to people who are serious about moving this burgeoning field forward, and the non-mathy posts appeal to people who are more casually interested in the concepts, and allow you to have a wider audience. You will have a balance between the two no matter what you attempt; the only questions are who your intended audience is, and the best way to reach those people.

    3 · Odinn · 9y
    Not sure why you got a downvote. Displaying, or worse still obstinately defending, poor reasoning is a valid reason for getting a downvote (I got a big stack of them with a sloppy article and from rushed comments [working on making it better]), but admitting that you aren't a mathematically focused person and providing feedback on Eliezer's communication style is no cause for it. Got my upvote.

    I enjoy all posts here, but would love a post on what it means to be rational. Something introductory, something you can link to when you talk with people who think "if you can justify what someone did, no matter what the justification is, the action becomes rational".

    then I am interested in hearing from you in the comments.

    While I appreciate the mathy posts as well as I can, as someone without much training in mathematics I really enjoy these types of posts (I've got a large backlog of your more mathy posts bookmarked for me to work through, whereas your non-mathy posts I read as soon as they show up in my feed reader).

    Let us have both!

    The ability to endure cognitive dissonance long enough to find the resolution to the dissonance, rather than just short-circuiting to something that makes no sense but offers relief from the strain, is a necessary precondition for rational thought.

    I don't think it can be cultivated, and I don't think there's a substitute. Either you pass through the gauntlet, or you don't.

    1 · SecondWind · 11y
    Couldn't you start with easier cognitive dissonances, and work your way up?

    I just want you to get to that "revelation" of yours already. I thought you were approaching it, if you're talking about neural nets and arithmetic coding. Where does it rank in your schedule? Or is this blog for human reasoning only?

    I was expecting to read yet another mathy post tonight, but I was disappointed. Less mathy stuff is OK, but shouldn't really come at the cost of anything interesting.

    I agree with Kriti - introductory essay, post, etc would be useful.

    I too prefer less mathy - well, to be precise I'll actually read the less mathy stuff in the first place.

    More to the point, I've stopped listening to news reports about global warming - and this is harming my ability to think rationally about it. I'll change the channel rather than hear someone say "You know how we all thought we've got 50 years to live? Turns out it's only 30/25/20."

    [Without having read the comments]

    WTF? You say: [...] I was actually advised to post something "fun", but I'd rather not [...]

    I think it was fun!

    BTW could we increase the probability of people being honest by basing reward not on individual choices, but on the log-likelihood over a sample of similar choices? (For a given meaning of similar.)

    As a mathematician I like your mathy posts, but this is also very welcome for a reason: it contains practical advice. Some posts are of little direct practical use but this one certainly is.

    Keep up the good work!

    By "this is also very welcome" I'm referring to this post.

    [having read the comments]

    Kriti et al: I'd recommend this and this to anybody who hasn't already read it. Otherwise I have not much idea for introductory texts right now.

    I think you should go with the advice and post something fun. Especially so if you have "much important material" to cover in following months. No need for a big hurry to lose readers. ;)

    I should however note that one of the last mathy posts (Mutual Information) struck a chord with me and caused an "Aha!" moment for which I am grateful.

    Specifically, it was this:

    I digress here to remark that the symmetry of the expression for the mutual information shows that Y must tell us as much about Z, on average, as Z tells us about Y. I leave it as an exercise to the reader to reconcile this with anything they were taught in logic class about how, if all ravens are black, being allowed to reason Raven(x)->Black(x) doesn't mean you're al...

    I like non-mathy posts. I particularly enjoyed this one, as it seems to have a clear practical application.

    I liked this post, but then again, I like all your posts Eliezer! (I've just been hiding behind my feedreader, and so not commenting about it before.)

    My opinion about mathy/non-mathy is that you should do what you think is most natural. Most days, you'll probably want to get on with the mathy exposition (and I am very much looking forward to the more advanced mathy posts), and then sprinkle in something lighter when the occasions to do so arise. For instance, I like that you based today's post on a recent discussion you had.

    I believe this approach would be most conducive to interesting reading.

    'Newly minted quadriplegics'? What's more fun than that?

    Don't worry too much about who wants what when. Like you say, it's all important stuff, and at a post a day no-one's going to complain about the odd vignette. Just keep up the good work.

    Thank GOD for non-mathy posts ;-)

    There's a common literary technique used in most storytelling in which the author writes alternating "up" and "down" scenes -- it provides pacing and context; it also allows us time to digest the "up" scenes.

    It seems to me that the technique is appropriate here -- it might be worth making a goal for yourself to write a mathy post, then to follow up with a post on the same topic but without any math in it at all, except maybe references to the previous post. That would be an interesting exercise for you, I think. It's supposed ...

    There hasn't been much evidence of atheists forming groups that have the positive aspects that a church/synagogue/mosque holds in the social life of some humans. So you might forgive a theist pretending to be a rationalist, for not holding the probability of this happening very high, and that the world would lack said institutions and would be a worse place.

    If rationalists truly want to get rid of religions, without getting rid of humans, we would have to ask ourselves, "What do humans get out of being part of a religion?" And then provide that ...

    Eliezer,

    You know that you can't succeed without the math, and slowing down for posts like this is taking away 24 hours that might have been better used to save humanity. Not that this was a bad post, but I think you would be better off letting others write the fun posts unless you need to write a fun post to recover from teaching.

    Eliezer, this was a welcome relief from the long series of mathy posts.

    Eliezer, suppose it turned out to be the case that:

    1) God exists.
    2) At some time in the future, tomorrow, for example, God comes to Eliezer Yudkowsky in order to announce His existence.
    3) Not only does He announce His existence, but He is willing to have His existence and power tested, and passes every test.
    4) He also asserts that according to Eliezer's CEV, although not according to his present knowledge, God's way of acting in the world is perfectly moral, even according to Eliezer's values.

    How would you react to these events? Would you write a post about them on OB?

    Thanks for feedback, all! The consensus appears to favor leavening mathy posts with less mathy ones. I'll bear that in mind, though I make no promises - I do have my own agenda here.


    Unknown, can't say I've ever thought of that one. I've considered how to kill or rewrite a Judeo-Christian type God, but not that particular scenario you've just described.

    I think I would simply reply to number 4, "I don't believe that without an explanation." After all, just because an entity displays great power doesn't mean it will always tell you the truth.

    Y...

    Alternatively, if you want something super scary, try 1), 2), and 3) without 4).

    I've considered how to kill or rewrite a Judeo-Christian type God

    Please make this your next "fun" post. (Speaking of which, I enjoy the digression.)

    You can't necessarily force me to consider believing number 4 because it involves a moral question and those are not subject to forced visualization (by this rule) in the way that factual scenarios are.

    But "my CEV judges killing babies as good" (unlike "killing babies is good") is a factual proposition. You know what your current moral judgments are, but you can't be certain what t...

    This reminds me of an item from a list of "horrible job interview questions" we once devised for SIAI:

    Would you kill babies if it was intrinsically the right thing to do? Yes/No

    If you circled "no", explain under what circumstances you would not do the right thing to do: ___

    If you circled "yes", how right would it have to be, for how many babies? ___

    What a horrible horrible question. My answer is ... what do you mean when you say "intrinsically the right thing to do"? The "right thing" according to whom? If it...

    "This reminds me of an item from a list of "horrible job interview questions" we once devised for SIAI:"

    Could you post these?

    "I've considered how to kill or rewrite a Judeo-Christian type God"

    Okay, now I'm curious what you've concluded with regards to that. :)

    Probably not worth doing more than just talking 'bout it in comments, if that, unless you feel like doing a post on that just for fun.

    But as far as this post, I also liked it. Useful to have actual suggestions for mental practices to practice to help one debias oneself.

    Why do the work of hypothesizing the world without God? It's not like Nietzsche, Sartre, Camus, Marx, Shaw, Derrida, etc. haven't done a much better job of it than me, because they were better philosophers than me. However, I also consider Aquinas to be a better philosopher than the aforementioned. Is that so unreasonable?

    Thanks for reminding me of The Art of War from your quote. You might be interested in this great translation - http://www.sonshi.com/huynh.html

    "The mathy posts appeal to people who are serious about moving this burgeoning field forward, and the non-mathy posts appeal to people who are more casually interested in the concepts" - (Snappycrunch)

    Beware of mistaking mathematical thinking for rational thinking; math is a tool like any other, to be used rationally or irrationally. Nassim Taleb demonstrates this very well in his book "Fooled by Randomness".

    There's nothing casual about being interested in the concepts of rational thinking; even the mathematically minded (who will naturally be more interested in the mathy posts) need the concepts to understand what framework to put the math into.

    How does one go about visualizing a world without souls? Or rather, a world in which nobody believes in souls, and how would this visualization have any bearing on "reality"? It seems like the thought experiment is really: What would I do if everything were the same except I didn't have a soul?

    Regardless of all previous posts.

    I think you write better when you are expressing your beliefs and inner thoughts as opposed to the mathematical equation that leads you there.

    “Do not dwell in the past, do not dream of the future, concentrate the mind on the present moment.”

    Just a thought. Anna

    slowing down for posts like this is taking away 24 hours that might have been better used to save humanity.

    Sarcasm? Humour? Sincerity?

    I've considered how to kill or rewrite a Judeo-Christian type God

    Please make this your next "fun" post.

    Seconded!

    I've considered how to kill or rewrite a Judeo-Christian type God

    Obligatory Pascal: Ah, but what if there's a tiny chance that He's reading along to figure out our tactics?

    Steven: To kill or rewrite a Judeo-Christian God, obviously, the technique has to work even if the God can read your planning thoughts. It's a lot easier than dealing with an UFAI, though, because the Judeo-Christian God has anthropomorphic cognitive vulnerabilities and a considerable response time delay. ("You ate the apple?")

    Naturally you prefer to rewrite the God if possible - shame to waste all that power.

    Heh, so how do you know that it is not the case that this hypothetical JCG reads overcomingbias but not people's private thoughts?

    (Of course as long as we're under these weird assumptions then not discussing tactics could be a fatal mistake too, etc etc)

    I'm skeptical about the possibility of really carrying out this kind of visualization (or, more broadly, imaginary leap). Here's why.

    I might be able to say that I can imagine the existence of a god, and what the world would be like if, say, it were the Christian one. But I can't imagine myself in that world -- in that world, I'm a different person. For in that world, either I hold the counterfactually true belief that there is such a god, or I don't. If I don't hold that belief, then my response to that world is the same as my response to this world. ...

    This reminds me of an item from a list of "horrible job interview questions" we once devised for SIAI:

    Would you kill babies if it was intrinsically the right thing to do? Yes/No

    If you circled "no", explain under what circumstances you would not do the right thing to do: I assume by "intrinsically right thing to do", you do not intend something straightforward like "here are five babies carrying a virus which, if left unchecked, will wipe out half the population of the planet. There is no means by which they can be quaran...

    I would show the proof to as many moral philosophers as I could

    Boy, I sure wouldn't. Ever read Cherniak's "The Riddle of the Universe and Its Solution"?

    I am well aware that a fundamentalist could take my previous paragraph, replace "killing babies" with "oral sex" and thus make his prudery unassailable by argument. So much the worse for him, I say.

    I sympathize, but I don't think that really solves the dilemma.

    Post what you want to post most. The advice that you should go against your own instincts and pander is bad, in my opinion. The only things you should force yourself to do are: (1) try to post something every day, and (2) try to edit and delete comments as little as possible. I believe the result will be an excellent and authentic blog with the types of readers you want most (and that are most useful to you).

    Eliezer,

    I think there is pretty overwhelming evidence that moral philosophers are almost never moved to do anything nearly so onerous and dangerous as killing babies by their moral views. See Unger, Singer, Parfit, etc.

    That title confused me. I expected an article on how, when debating, it was better to leave the opponent a line of retreat so that they would not feel dialectically cornered and start panicking. Of course, along that line of retreat, your arguments would be waiting for them. Socrates apparently was a true master of this little dance. This is especially useful if you have a lot of time and you are trying to actually change the way your opponent thinks, rather than changing that of an audience.

    1 · timtyler · 13y
    I am pretty sure that is what the term "leaving a line of retreat" in the context of an argument or disagreement should be used to refer to. The meaning being proposed in this post is counter-intuitive. I classify it as being undesirable terminology.

    Great post!

    I think the greatest test of self honesty (maybe it ties with honestly imagining the world you wish weren't real) would be admitting to yourself that the world looks an awful lot like the hypothetical world you just vividly imagined. I think if anyone who believes in god or homeopathy or what-have-you honestly imagined what the world would look like if their belief were wrong, and they had enough courage, they'd admit to themselves that the world looks a lot like that already.

    You really should write a book. Seriously. I could probably propose teaching Rationality as a first-year course (as a follow-up to Logic) instead of useless "password" classes like I've received at my college. Having a book I could wave around to convince people that maybe being rational is important when you're a scientist would help a lot. At least I'd start printing and distributing it.

    You could also just put the primary sequences of this website into a (e)book format, and release it. You might reach a wider audience that way, which would of course be Winning.

    0 · thomblake · 11y
    A serious book on Rationality has been in the works for some time.
    0 · Nornagest · 11y
    There's a couple of ebook versions of the Sequences floating around. I believe an official release is still in the works, but links to several unofficial ones may be found here.
    2 · [anonymous] · 11y
    The trouble with the sequences is that each was written in the course of a day, and most were unrevised since then. They're obviously rich and interesting, but far from publishable material. The sequences meet every standard you could want for being insightful, but they fall far short of most standards of factual accuracy, organization, contact with contemporary discussions, etc.

    The hope is that it takes less courage to visualize an uncomfortable state of affairs as a thought experiment, than to consider how likely it is to be true. But then after you do the former, it becomes easier to do the latter.

    And again you manage to condense a wise life lesson to two sentences. I should really write them down.

    "How many religious people would retain their belief in God, if they could accurately visualize that hypothetical world in which there was no God and they themselves have become atheists?"

    More than a few. For example, if you are a Muslim in some places, accurately visualizing the world where you become an atheist means visualizing a world in which you get killed for apostasy.

    1 · shminux · 9y
    I don't think that's quite it. For many, the world where there is no God is like the world where you have no parents.

    Evangelism and creationism don't tend to go down very well here, but you know what's likely to go down even less well? Claiming to have conclusive evidence against things near-universally believed here (e.g., evolution) and not bothering to provide us with any of it.

    I don't want to mislead you; if you do tell us some of the things you regard as demonstrating that evolution is "a fairy tale", those things are not likely to get the sort of reception you would prefer them to get. (I say: because you're claiming to offer conclusive evidence of something that i...

    the clay telling the potter what he should do with the clay ... It's His game, His gameboard, His pieces, His rules, His decisions 

    In other words, might makes right? 

    The great sin against reason is not belief in a God, it's belief in a good God. But people cling to scraps of unreason and hope in order to endure this horror show of a world. 

    Are you related to Tom McCabe, who posted on this page years ago? Is there some tragedy that brings you here?