Followup to: The Thing That I Protect
Anything done with an ulterior motive has to be done with a pure heart. You cannot serve your ulterior motive without faithfully prosecuting your overt purpose as a thing in its own right, one that has its own integrity. If, for example, you're writing about rationality with the intention of recruiting people to your utilitarian Cause, then you cannot talk too much about your Cause, or you will fail to write successfully about rationality.
This doesn't mean that you never say anything about your Cause, but there's a balance to be struck. "A fanatic is someone who can't change his mind and won't change the subject."
In previous months, I've pushed this balance too far toward talking about Singularity-related things. And this was for (first-order) selfish reasons on my part; I was finally GETTING STUFF SAID that had been building up painfully in my brain for FRICKIN' YEARS. And so I just kept writing, because it was finally coming out. For those of you who have not the slightest interest, I'm sorry to have polluted your blog with that.
There are a number of reasons to pull back. One of them is simply to restore the balance. Another is to make sure that a forum intended for a more general audience doesn't narrow itself down and disappear.
But more importantly – there are certain subjects which tend to drive people crazy, even if there's truth behind them. Quantum mechanics would be the paradigmatic example; you don't have to go funny in the head, but a lot of people do. Likewise Gödel's Theorem, consciousness, Artificial Intelligence –
The concept of "Friendly AI" can be poisonous in certain ways. True or false, it carries risks to mental health. And not just the obvious liabilities of praising a Happy Thing – something stranger and subtler that drains enthusiasm.