15 Comments

Totally agree.

I'm a numerate systems analyst working with engineers, and they really are uncomfortable with, even psychologically averse to, uncertainty. They spend all their time designing out minuscule risks in a way that massively over-engineers and wastes time and resources.

I think good scientists seek truth, good engineers seek solutions, and bad scientists and bad engineers seek certainty. Science has a bias towards certainty because of how papers are published, which in turn caters to the limited memory of humans. The right way to do science is to keep track of all the conditional probabilities, but that's too hard for humans to do.
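As a sketch of what "keeping track of all the conditional probabilities" could look like in practice (my illustration, not the commenter's; the likelihood numbers are made up), here is a running Bayesian update of belief in a hypothesis across several studies, rather than a binary proven/disproven verdict:

```python
def update(prior, p_data_given_h, p_data_given_not_h):
    """One Bayes update: return P(H | data) given P(H) and both likelihoods."""
    numerator = p_data_given_h * prior
    return numerator / (numerator + p_data_given_not_h * (1 - prior))

belief = 0.5  # start agnostic about the hypothesis H
# Each (hypothetical) study reports how likely its data is under H vs. not-H.
studies = [(0.8, 0.3), (0.6, 0.5), (0.9, 0.2), (0.4, 0.6)]
for p_h, p_not_h in studies:
    belief = update(belief, p_h, p_not_h)

print(f"posterior after all studies: {belief:.3f}")
```

Note that the fourth study cuts against the hypothesis and pulls the posterior down rather than "disproving" anything outright, which is exactly the bookkeeping the comment says humans find too hard to do in their heads.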

Realistically speaking, any socio-economic system needs maintenance, because no system is 100% waterproof (for example, laws that reference "rational persons"), so corruption and perversion are always possible and must be fought continuously. Just look at how people managed to turn the ideas of the world's first hippie into a dogmatic, hateful, and bigoted institution.

Scientists, then, think of scientific practice in terms of engineering metaphors? Have any theory about why they'd do that?

I think Drexler (or Robin) is playing on a stereotype of science that is both false (of science done properly) and true (describes the conscious beliefs of many scientists).

I stand corrected on computer scientists.

A tangent on your difference with Drexler: I've wondered what it means when I think an explanation applies but the direction is reversed. Prima facie, it seems strange, doesn't it? One might think it would tend to discredit both sides. If you and Drexler agree that a conflict expresses the tension between scientific and engineering metaphors but disagree about which is which, doesn't that tend to suggest you both are imagining an application that doesn't exist? I mean, what explains why you and Drexler would agree that the wrong mindset is being applied but disagree about which one it is? The evidence can't be very strong for either mindset being the cause.

Maybe (hopefully) that's wrong: you can see that a problem is due to one or the other opposed mindsets without being clear on which one. I don't for the moment see how.

"I think computer scientists count as engineers rather than scientists."

As a computer scientist, I say, "AAARGH!"

Computer science is science, not engineering. Computer programming and computer engineering are engineering. They are different fields. A computer programmer is not a computer scientist. Basic research in computer science is a real thing, and, like economics, maybe someday it will get funding, if more people realize that.

Anyway, my point was that in my experience, it's the mathematically-numerate scientists who are more comfortable with uncertainty. Engineers don't accept uncertainty; they over-engineer enough to bring themselves back into the realm of certainty. So I think the way Drexler says it has it backwards.

I think computer scientists count as engineers rather than scientists.

"Non-scientists and biologists forbid them from using imprecise knowledge, because they use engineering metaphors as their guide to how knowledge must be accumulated."

That's what Hanson does, isn't it? Isn't "small clues" basically an engineering metaphor? From a scientific standpoint, a small clue is unlikely to take you toward a true theory, but from an engineering standpoint, it can serve as the basis for useful adjustments.

"When faced with imprecise knowledge, a scientist will be inclined to improve it, yet an engineer will routinely accept it."

I have always had exactly the opposite problem in my jobs. Mathematically-inclined scientists such as computer scientists want to use imprecise knowledge. Non-scientists and biologists forbid them from using imprecise knowledge, because they use engineering metaphors as their guide to how knowledge must be accumulated. They think that annotating a genome is like building a space shuttle, where one builds small components, tests them thoroughly in isolation, and then when absolutely sure that component works, goes on and builds components built out of those small components, and so on until they have a space shuttle. Whereas the mathematician understands that doing that throws out 99% of the data by the time you reach your conclusion.

A scientist builds robust systems by understanding the interactions in the data and ensuring mathematically that the odds of failure are tiny. An engineer builds robust systems by being conservative, working forward one step at a time and moving to the next step only when certain of success on the present step. This is the wrong way to work with information.
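A toy illustration of the two approaches (my own numbers, not the commenter's): suppose you have 20 independent signals, each only 70% reliable. The "engineer-style" workflow of accepting only individually certain components throws every signal away, while modeling the whole system jointly, here via a simple majority vote, drives the odds of failure down to a few percent:

```python
from math import comb

n_signals = 20
p_signal_correct = 0.7  # each signal is only weakly informative on its own

# Step-by-step certainty: keep a signal only if it individually clears a
# 0.95 reliability bar. Every 0.7-reliable signal fails the bar, so this
# workflow discards all the data and supports no conclusion at all.
kept = [p for p in [p_signal_correct] * n_signals if p >= 0.95]
print(f"signals surviving the certainty filter: {len(kept)}")

# Whole-system view: take a majority vote over all 20 weak signals.
# P(majority wrong) = P(Binomial(20, 0.7) <= 10), which is small.
p_majority_wrong = sum(
    comb(n_signals, k) * p_signal_correct**k * (1 - p_signal_correct)**(n_signals - k)
    for k in range(n_signals // 2 + 1)
)
print(f"odds the majority of weak signals is wrong: {p_majority_wrong:.4f}")
```

The vote comes out wrong only about 4.8% of the time, even though no single signal would survive a certainty filter; the calculation assumes the signals are independent, which is the kind of interaction-in-the-data a scientist has to check.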

Or some other misunderstanding. I was going to make the same point. It is the usual state of affairs in biology to have many studies on the same topic, and "it has been proven" means something like "four out of six studies concluded" (or, distressingly often, "four out of the six studies that I refer to in this paper, out of twenty studies on this topic, concluded").

This is a bit of a side note, but one of the other reasons complex engineered systems do not fail is maintenance. This is something that we engineers take into account and mandate in our designs. It is one of the ways we bound the problem. Perhaps the idea of maintenance can be translated to the sphere of problems that economists seek to affect.

If the added margin of safety is too costly, science could be useful for adjusting the cost optimally. But only if the relevant scientific knowledge is available.

The reasons may pose a difficult scientific puzzle, yet an engineer might see no problem at all. Add a 50 percent margin of safety, and move on - See more at: http://www.overcomingbias.c...

"In science a single failed prediction can disprove a theory"

In theory, yes. In practice, a single failed prediction is more likely to show lax experimental standards.

I'd like to see some evidence that your economics colleagues lack this kind of far-mode understanding of engineering. You seem to position yourself within the economics profession as the one with engineering knowledge, but you have never shown how your near-mode understanding of engineering translates into a distinctive far-mode view.
