In 1983, NASA was planning to bring Martian soil samples back to Earth. Contaminating Earth with alien organisms was a concern, but engineers at the Jet Propulsion Laboratory had devised a "safe" capsule re-entry system to avoid that risk. However, Carl Sagan was unconvinced: he proposed that if the engineers were really sure of their system, they should be willing to test it with the capsule loaded with anthrax spores.
I like your idea. It's done.
Zubon, your comments are good, and I can't come up with particularly good ideas for dangerous testing of grey goo and strange matter. Probably the best is to hand the developed technology over to terrorists.
Speaking of terrorists and testing, I've added an addendum to my post for dealing with terrorism and other similar issues, where a direct "dangerous test" is not possible.
Global warming is more nebulous and political, so a specific dangerous test is hard to imagine. A bastardised version of dangerous testing might be to require either a massive increase or a massive decrease of greenhouse gases, with no in-between allowed. Those who argue against greenhouse gas reductions would then have to be much more certain of their case. But that isn't really the same thing.
With even a small risk of grey goo, what would be the proper test for self-replicating nanotechnology? There are occasional scientific experiments that have some infinitesimal chance of destroying all the matter in the universe or at least all life on Earth. What additional tests might we have done before fiddling with strange matter?
The space capsule example seems like an easy case because there is an obvious safe way to test it (with something other than anthrax). We do a great many things that bear some tiny risk of eliminating our species (or more), and I cannot see how you test most potential rends in the universe without the risks inherent in making them.
There are claims that global warming presents an existential threat for humans. Are there dangerous tests we should be applying there?
But the fact that there was a risk of contamination from a Martian organism should have been all the incentive JPL engineers needed to be totally sure of themselves (I doubt they would have been more afraid of alien pandemic + legal liability than simply an alien pandemic).
However their reaction to Sagan's proposal suggested they still weren't totally sure of themselves. They were overconfident that things would go right, and it took a specific image (anthrax loose on the world through their fault) to make them doubt this.
Overconfidence is especially a problem when the risk is very low, and when the risk is very low for reasons other than the security of the design (in this case, the fact that there probably aren't dangerous organisms on Mars). Dangerous testing gives them (engineers and pointy-haired types) a specific image of what could go wrong, and puts the entire onus on their system alone. Both help overcome overconfidence.
Well, it would be very expensive to run such a test (launching something into space isn't cheap), even if you decided that such a test would be an appropriate way to convincingly demonstrate that the capsule was safe.
If there was a risk (as some scientists thought) of an alien organism starting a pandemic, then the cost is trivial. And the dangerous test is not really a test at all - it's there to make sure that all proper tests have been done beforehand, and all possibilities of failure considered.
I'm only suggesting this sort of testing for things we need to be "absolutely safe" - where one single failure would be a disaster. I don't suggest we do so for anything where we can accept a level of risk (whether the Titanic qualified for needing to be "absolutely safe" is an ethical and economic question, not an engineering one).
The point of testing is that you arrange things so as to try different contingencies under conditions that minimise the bad consequences. Presumably, you'd suggest that (say) Cisco should push out updates to all the Internet routers in the world without doing any tests first - that way, they'd be more careful.
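The "minimise the bad consequences" idea can be made concrete with a toy staged-rollout sketch. Everything here is invented for illustration (this is not Cisco's actual process): an update is applied to progressively larger fractions of a fleet, and deployment halts at the first stage whose failure rate exceeds a threshold, so a bad update damages only a small slice.

```python
# Illustrative sketch of staged (canary) rollout: deploy an update to
# growing fractions of a fleet, halting if any stage shows failures.
# `staged_rollout`, the stage fractions, and `update_ok` are all
# hypothetical names for this example.

def staged_rollout(fleet, update_ok, stages=(0.01, 0.1, 0.5, 1.0),
                   max_failure_rate=0.0):
    """Apply the update stage by stage.

    fleet: list of devices to update.
    update_ok: callable returning True if the update succeeded on a device.
    Returns (number_deployed, halted).
    """
    deployed = 0
    for frac in stages:
        target = int(len(fleet) * frac)
        batch = fleet[deployed:target]
        failures = sum(0 if update_ok(dev) else 1 for dev in batch)
        deployed = target
        if batch and failures / len(batch) > max_failure_rate:
            # Halt: the damage is confined to this stage's batch.
            return deployed, True
    return deployed, False

routers = list(range(1000))
# A good update reaches the whole fleet.
print(staged_rollout(routers, lambda r: True))   # (1000, False)
# A broken update is stopped after the first 1% stage.
print(staged_rollout(routers, lambda r: False))  # (10, True)
```

The design point mirrors the comment above: the test is arranged so that failure is survivable, which is exactly what "no testing, just be careful" throws away.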
Dangerous testing would presumably motivate pointy-haired bosses as well - anyone up the decision and responsibility hierarchy.
The law and economics literature is the right place to look for answers to this sort of question. How careful an analysis one should do of any risk should depend on how large the harm from a failure is, which varies from case to case. So rather than set some arbitrary rule about how much testing should be done, it is better to make sure they have the right incentives to do the appropriate amount of testing. Appropriate incentives can be set by the legal liability regime. The right liability rule for any one area of behavior depends on who has what ability to prevent harm, and how easily the courts can determine whether the right kinds of precautions were taken.
The Titanic is commonly blamed on engineers, but it was built in an era of cutthroat competition between ocean liners. Early iron steamships - where the engineers were given their head to take whatever precautions they liked, without worrying about shaving one penny more than their competitors - were actually much safer. The part about "Guaranteed not to crash!" was marketroid hype.
Most "engineering" disasters are management disasters, though there are exceptions. I never heard the Tacoma Narrows bridge blamed on pointy-haired bosses. Still, if you're going to cite an example of engineering overconfidence, cite Castle Bravo or something, not the Titanic.
What would have been the effects of such a policy had it been carried out in the past? I'm sure we would not have had the Titanic. I suspect we also would not have had fire.