What many people like about being religious is being part of a community built on the idea of being and doing good. They can meet and discuss how to be and do good, share practical tips and sometimes just do good together. That sure can feel great.
What many people dislike about other people being religious is their habit of presuming that if you aren’t religious in their way, you aren’t being or doing good; you are bad. Religious people often prefer similarly religious people to be their teachers, grocers, leaders, etc., because they can’t trust bad people in such roles and shouldn’t support bad people even if they can.
Many non- or otherly-religious folks say they have nothing against doing good, but say it is laughable to presume that people who are religious in your way are actually much better than others. Most religions do little to actually sort people by how much good they are or do; they mostly sort by loyalty, conformity, impressiveness, and local social status. Religions could sort people better if they spent lots of time together doing things most everyone agrees are clearly good, like healing the sick, but that is pretty rare.
My ex-co-blogger Eliezer Yudkowsky left this blog in 2009 to start the Less Wrong (LW) blog, which helped seed a growing community that self-consciously sees itself as “rationalists”. They meet online and in person and often discuss how to be more rational. Which is a fine goal. I’ve supported it by listing recent LW posts on the sidebar of this blog, and I’ve attended many LW-based social events. Some high status members of that community now offer (not-free) workshops where they teach you how to be more rational.
As with religion, the main problem comes when a self-described rationalist community starts to believe that they are in fact much more rational than outsiders, and thus should greatly prefer the beliefs of insiders. This happens today with academia, which generally refuses to consider non-academic beliefs as evidence of anything, and with political ideologies that consider themselves more “reality-based.”
Similarly, I’ve noticed a substantial tendency of folks in this rationalist community to prefer beliefs by insiders, even when those claims are quite contrarian to most outsiders. Some say that since most outsiders are quite irrational, one should mostly ignore their beliefs. They also sometimes refer to the fact that high status insiders tend to have high IQ and math skills. Now I happen to share some of their contrarian beliefs, but disagree with many others, so overall I think they are too willing to believe their insiders, at least for the goal of belief accuracy. For the more common goal of acceptance within a community, their beliefs can be more reasonable.
Some high status members of this rationalist community (Peter Thiel, Jaan Tallinn, Zvi Mowshowitz, Michael Vassar) have a new medical startup, MetaMed, endorsed by other high status members (Eliezer Yudkowsky, Michael Anissimov). (See also this coverage.) You tell MetaMed your troubles, give them your data, and pay them $5000 or $200/hour for their time (I can’t find any prices at the MetaMed site, but those are numbers mentioned in other coverage). MetaMed will then do “personalized research,” summarize the literature, and give you “actionable options.” Presumably they somehow try to stop just short of the line of recommending treatments, as only doctors are legally allowed to do that. But I’d guess you’ll be able to read between the lines.
Of course that is usually what you pay doctors to do – study your charts and recommend treatment. And if you didn’t trust your main doctor, you could always get a second or third opinion. So why use MetaMed instead? The main evidence offered at the MetaMed site is data on high rates of misdiagnosis and mistreatment in medicine. Which of course means there is room for improvement via second and third opinions. But it doesn’t tell you that MetaMed is a relatively cost effective source of such opinions.
I wrote this post because I know several of the folks involved, and they asked me to write a post endorsing MetaMed. And I can certainly endorse the general idea of second opinions; the high rate and cost of errors justifies a lot more checking and caution. But on what basis could I recommend MetaMed in particular? Many in the rationalist community think you should trust MetaMed more because they are inside the community, and therefore should be presumed to be more rational.
But any effect of this sort is likely to be pretty weak, I think. Whatever the social pressures are that tend to corrupt the usual medical authorities, I expect them to eventually corrupt successful new medical firms as well. I can’t see that being self-avowed rationalists offers much protection there. Even so, I would very much like to see a much stronger habit of getting second opinions, and a much larger industry to support that habit. I thus hope that MetaMed succeeds.
Added 8:45p 23Mar: Sarah Constantin, MetaMed VP of research, replies to this post at Marginal Revolution (!):
Investigating your condition in depth, in the context of your entire medical history, genetic data, and personal priorities, may well turn up opportunities to do better than the standardized medical guidelines which at best maximize average health outcomes. That’s basically MetaMed’s raison d’être. … Fundamentally the thing we claim to be able to do is give you finer-grained information than your doctor will. …
Robin Hanson seems to be implying that MetaMed is claiming to be useful only because we’re members of the “rationalist community.” This isn’t true. We think we’re useful because we give our clients personalized attention, because we’re more statistically literate than most doctors, because we don’t have some of the misaligned incentives that the medical profession does (e.g. we don’t have an incentive to talk up the benefits of procedures/drugs that are reimbursable by insurance), because we have a variety of experts and specialists on our team, etc. (more)
I was asking why pick MetaMed over ordinary medical specialists. I expect most doctors will disagree strongly with the claims that they don’t give patients personalized attention, only improve average health outcomes, and don’t offer the finest-grained advice available. But they could be wrong, and it would be great if MetaMed could show that somehow. On misaligned incentives, a reason to ask a different ordinary doctor for a second opinion is exactly that they know they won’t get paid for any treatments they recommend.