My friend and colleague Tyler Cowen is smart, well-traveled, writes on cultural diversity, and manages a large organization. If his political writings forced him to flee for his life and live "off the grid" in a distant foreign land, with only a small chance of success and even less of returning, I expect he'd take a very practical approach and try with all his considerable strength. Tyler's wife would try hard to help him, easily preferring the uncertainty of never knowing if he made it over the certainty of turning him in to certain death. Even imagining the remote prospect of such a situation years ahead of time, I expect Tyler would be pretty rational and practical about this scenario.
But when Tyler considers the prospect of fleeing for his life into the future via cryonics, he thinks very differently:
[On cryonics] my current view is this: one's attention is extremely scarce and limited, as are one's affiliations. Insofar as you have the luxury of thinking "bigger thoughts," those thoughts should be directed at helping others, not at helping oneself. … Furthermore the universe (or multiverse) may be infinite, so in expected value terms it seems my copies and near-copies are already enjoying a kind of collective immortality. … What probability of future torture would cause us to wish to die forever rather than be resurrected? And should I therefore be scared by the idea of an infinite universe? Do Darwinian selection pressures — defined in the broadest possible way — suggest it is worth spending energy on making entities happy? Or do most entities end up as suffering slaves?
Huh? Can you imagine Tyler giving himself up to be killed for his writings because maybe other Tylers exist in a vast universe, because maybe he'd be tortured in a foreign land, or because saving his life would be a selfish "big thought"? No, like the woman in Monty Python's "Can we have your liver?" sketch, cowed into giving up her liver after hearing how vast the universe is, Tyler has succumbed to the severe human bias to think about distant times and places in impractical, abstract, symbolic terms.
Though I think they are mistaken, I can at least respect those, like Bryan Caplan or Penn and Teller, who reject cryonics because they think it has too little chance of working. But most other reactions seem just bizarre.
You don't have to be rich or selfish to be a cryonaut. I want to get to the future to help a disabled boy. He will then be much older and need my help. If they could, they would revive me, because I would have rare gold coins hidden in a National Park, and there is only one way they can get to them in the future.
Basiec
This is about people who reject the hope of cryonics as not worth the trouble. It is a separate question from the probability of success. And I absolutely defend my right to dilute my assessment of the value through distancing techniques.
Let's take the flee-to-another-country metaphor. You secretly learn that a tsunami will hit your city in a short time. You know that you cannot convince all your friends that this will happen. For some reason, there is only one way out of town: a single truck, which will take you to a place you have never been, which you may or may not like. In order to be loaded on the truck, you have to agree to be anaesthetized and put in a box. But there are more people in line for the procedure than there are spaces, so what they are going to do is anaesthetize everyone and then randomly choose which ones to load on the truck. There is also some opportunity cost for agreeing; say, you give up the chance to live it up/help people/write the perfect haiku/understand general relativity in your precious remaining time.
OK, the details need work, but the aspects are all there: on the one hand death, on the other uncertain value, imperfect probability, temporary total loss of control, and opportunity cost. There are clear models (expected value) for discounting all of the latter factors, except temporary total loss of control. For me, that break in the causal chain needs a discount too. What I am doing is *deliberately* using it to dilute my sense of self and so assume a more universal standpoint: here ends me; the only thing that matters after that is world peace/maximum fun/whatever.
So would I forego life-saving surgery with total anaesthesia? Well, yes, actually: the fact of total anaesthesia would probably significantly alter the opportunity cost/success probability package I'd accept - say, I'd need a 15% instead of a 10% chance of the surgery working perfectly before committing half of my current and future savings to it. And yes, if the doctor was someone I could know nothing about before the surgery - if in fact I'd never met a surgeon in my life, or ever heard of anybody surviving anaesthesia - that would also raise the price at which I'd accept the deal - say, to 25% in the same case. And if the surgery were opt-in and the death it prevented years in the future, sure, 50%. It may seem that I value my life unusually cheaply, as many people would give their whole life savings for a smaller chance of perfect recovery. Though I admit I am possibly exaggerating, I am pretty sure I'm atypical in the direction I signal - not that my life is cheap, but that I value quality (opportunity to help others included) over quantity, and the cost of the surgery is ultimately expressed in foregone quality.
The point is, I am perfectly willing to live with a bias here: an extra penalty factor on the expected value of cryonics for passing through what my experience counts as death, the universal price for birth. It is just using one bias - deliberate distancing into "far" thinking - to overcome another: my bias for personal survival over maximum worldwide fun and minimum suffering.
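To make the penalty-factor idea concrete, here is a minimal sketch in Python of how such a discounted expected-value comparison might look. The `accept_deal` helper, the `control_penalty` parameter, and all the numbers are my own illustrative assumptions chosen to echo the 10%-versus-15% example above; none of them come from the post or the comment itself.

```python
# Illustrative sketch only: values are made up to mirror the commenter's
# 10%-vs-15% example, not anything stated in the post.

def accept_deal(p_success, value_of_recovery, cost, control_penalty=1.0):
    """Accept the procedure if its discounted expected value exceeds its cost.

    p_success         - probability the procedure works perfectly
    value_of_recovery - value placed on a full recovery (same units as cost)
    cost              - opportunity cost paid up front (e.g. half of savings)
    control_penalty   - extra multiplicative discount (< 1) for the break in
                        the causal chain: temporary total loss of control
    """
    return p_success * value_of_recovery * control_penalty > cost

value, cost = 1000, 95

# With no penalty, a 10% chance clears the bar; with a 2/3 penalty the
# required probability rises to roughly 15%.
print(accept_deal(0.10, value, cost))                        # True
print(accept_deal(0.10, value, cost, control_penalty=2/3))   # False
print(accept_deal(0.15, value, cost, control_penalty=2/3))   # True
```

On this toy model, the "bias" the commenter describes is just the choice of a `control_penalty` below 1, which mechanically raises the success probability needed before the deal is worth taking.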
I don't know if that's Tyler's conscious reasoning, but given that he refers to the ultimate distance - his copies and near-copies in distant corners of the multiverse - I think that is what he's striving for here.
Would I have come to this conclusion in a world where freezeheads were as common as religious folk in this one, so I could opt out and use the money for myself and the poor? Maybe not, but quite possibly.