
Have you thought/written about what kind of socio-computational infrastructure would be practically required to support the em economy?

> Thus over time serial software will become less valuable, relative to ems and parallel software.

Ems are just a special category of parallel software, right? What kind of software are current human programmers?


By historical standards, 50 is gigantic. In the 1950s, I don't think there were even 50 full-time programmers in the entire country. By the 70s, a typical programming team still consisted of one or two people. Team sizes in the single digits were standard practice as recently as the early 90s.


Okay, that makes more sense.


For the most part, at least historically, "new hardware" hasn't required new software.

That's why we write in high-level languages, not machine code.

(You can take advantage of new features with new software, sometimes. But often it's just automatic, or the changes are far less than a complete re-write.)
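To make that concrete, a toy sketch (my own, deliberately trivial): the source below contains nothing machine-specific, so the runtime underneath is free to target whatever hardware exists.

```python
# A toy illustration: this high-level source has no machine-specific
# instructions in it, so the same file runs unchanged on old or new CPUs;
# the interpreter/compiler underneath absorbs the hardware differences.

def dot(xs, ys):
    return sum(x * y for x, y in zip(xs, ys))

print(dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # 32.0, on any machine
```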


"The increasing quantity of software purchased has also led to larger software projects, which involve more engineers."

Do you have a citation for this claim? At first glance it sounds wrong to me. I would say that better hardware has led to programming environments that favor programmer productivity over machine efficiency (e.g. Python over C). This, combined with the proliferation of open-source software libraries, has made it easier than ever for a small team to make a big impact. For example, WhatsApp had a team of ~50 people and was acquired for billions of dollars. And Instagram had a double-digit number of employees and was acquired for a nine-digit number of dollars. Silicon Valley at its best is a small, egalitarian team of highly intelligent engineers getting to know their customers thoroughly and doing careful, focused work to serve their needs.


To know how to accelerate all the elements by the same amount, you have to individuate the elements. The question seems to be whether you can read off the real elements from scan-based emulations.


After reading it, I would recommend reducing the amount of technical jargon if this is meant for popular consumption, even when some details are lost in the translation. Examples:

- Serial = network speeds
- Parallel = processing power
- High-level abstraction = user-friendly
- Tools = programming software


Duke Nukem Forever was built on at least 3 different engines. Each time, this forced the developers to redo the majority of their work.


It depends on the accuracy needed. You can certainly simulate neurons faster than real neurons go (although wiring up enough hardware might be a challenge, it's not insurmountable). Even if you want to solve some analogue partial differential equations for how signals and chemicals travel, that can probably be done faster than real time. Detailed biochemistry, not so much I would guess.
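A rough sanity check (a toy sketch with made-up parameters, not a serious benchmark): even a plain-Python leaky integrate-and-fire model steps through a second of simulated neuron time in a small fraction of a second of wall time.

```python
import time

# Toy leaky integrate-and-fire neuron; all parameters invented for illustration.
def simulate(seconds, dt=1e-4):
    """Step through `seconds` of simulated biological time in steps of `dt`."""
    v, v_rest, v_thresh, tau = -65.0, -65.0, -50.0, 0.02
    drive = 20.0   # constant input, expressed as mV of equivalent drive
    spikes = 0
    for _ in range(int(seconds / dt)):
        v += (dt / tau) * (v_rest - v + drive)   # leak toward rest, plus input
        if v >= v_thresh:                        # fire and reset
            v, spikes = v_rest, spikes + 1
    return spikes

start = time.perf_counter()
n = simulate(1.0)   # one second of simulated neuron time
elapsed = time.perf_counter() - start
print(f"{n} spikes; 1.0 s of neuron time simulated in {elapsed:.4f} s of wall time")
```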


There are languages in which some type checking is done at run time.
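Python is the standard example; the failure below only surfaces when the offending line actually executes:

```python
def add(a, b):
    return a + b           # no static check; types are examined at run time

print(add(1, 2))           # fine: prints 3
print(add("1", 2))         # TypeError raised here, at run time
```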


My claim is that the vast majority of effort in the field of software development is directed at problems that are not unique to bad programmers.

I claim that instead, the big problems in software development (and the ones the most effort is spent on ameliorating) are the sort of problem that affects good and bad programmers ~equally. Tools that mitigate these problems are therefore of ~equal benefit to good and bad programmers.

Consequently, changes in the distribution of programmer skill do not have strong effects on the optimal allocation of effort towards tools, nor even the allocation among the different types of tools.


I didn't say that all software tools are for dummies. And the claim you saw as about hardware vs. software is that em-running software will improve less fast than other software. Surely many tools are designed with an eye for the distribution of intelligence in the population of users.


Nitpick:

> Tools like **automated type checking** and garbage collection would tend to be done in parallel, or not at all.

This seems weird to point out. Type checking is done _when the programmer writes the code_, not when the program is executing. Even if it were completely serial, compiling even a very large program is going to be billions of times less computationally intensive than simulating a human brain. Google recompiles their _entire codebase_, generating hundreds of thousands of artifacts, every few seconds.
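To illustrate (assuming a standard external checker like mypy; the file name is hypothetical): the mismatch below is caught by running the checker over the source, with no program execution at all.

```python
# checked_example.py (hypothetical file name): running
# `mypy checked_example.py` reports the mismatch below
# without ever executing the program.

def square(n: int) -> int:
    return n * n

result: int = square("three")  # mypy: incompatible type "str"; expected "int"
```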

> When software can be written quickly via very fast software engineers, product development could happen quickly, even when very large sums were spent.

Why is software development sped up relative to changes in business conditions? Are you arguing that the business world is more serial than the software development world [and therefore harder to speed up](http://en.wikipedia.org/wik...)? Why is the business world harder to parallelize?
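My own reading of the serial-vs-parallel point, as a toy Amdahl-style calculation (the 20% figure is invented for illustration): if some fraction of a process is inherently serial, adding parallel capacity runs into a hard ceiling.

```python
def amdahl_speedup(serial_fraction, n_workers):
    """Overall speedup when only the parallel part benefits from more workers."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_workers)

# Invented illustration: if 20% of the work is inherently serial
# (decisions that must happen one after another), even unlimited
# parallel capacity caps the overall speedup at 1 / 0.2 = 5x.
for n in (1, 10, 100, 10_000):
    print(n, round(amdahl_speedup(0.2, n), 2))
# 1 -> 1.0, 10 -> 3.57, 100 -> 4.81, 10000 -> 5.0
```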


> There would be little interest in tools and methods specialized to be useful “for dummies.”

To a first approximation these tools and methods do not exist.

Software design patterns are not used to prevent bad programmers from making mistakes. Software design patterns are used to prevent *all* programmers from making mistakes.

Writing large pieces of software involves computational problems that are genuinely difficult. Tools like abstraction aren't just crutches for non-geniuses; they reduce the complexity of the problem for all programmers. Thus even a community of brilliant em programmers will continue to rely on them. Programmer ability gains will be largely allocated towards making more powerful or complicated software rather than towards reducing reliance on design patterns.
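A concrete, if toy, example of that point: resource cleanup is a mistake strong and weak programmers alike eventually make, and the `with` abstraction removes the whole class of error for everyone (sketch assumes a local data.txt exists):

```python
def process(text):
    print(len(text))   # stand-in for real work

# Without the abstraction, every caller must remember the cleanup path,
# including on the exception branch; this is a mistake anyone can make.
f = open("data.txt")   # assumes a local data.txt exists
try:
    process(f.read())
finally:
    f.close()

# The context-manager abstraction guarantees the close for everyone,
# removing the whole class of error rather than helping only "dummies".
with open("data.txt") as f:
    process(f.read())
```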

I also question the implicit assumption that ems will make us better at writing software faster than they make us better at inventing faster hardware.


"What kind of computing project would be undermined rather than aided by improved* iron under it?"

The software may need to be rewritten to compete against other programs that use the improved hardware. If the rewriting process is too slow, the hardware could improve fast enough that the software just keeps lagging farther and farther behind.

I hear that Duke Nukem Forever had a version of this problem.


The relationship between elements doesn't matter if you accelerate them all by the same amount.
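A toy sketch of why (my illustration): if "accelerating" an emulation just means computing more update steps per wall-clock second, the sequence of states, and hence every relationship between the elements, is identical at any speed.

```python
import time

def run(steps, wall_seconds_per_step):
    """Two coupled elements; speed changes wall-clock pacing, not the states."""
    a, b, history = 1.0, 0.0, []
    for _ in range(steps):
        a, b = a - 0.1 * b, b + 0.1 * a    # the elements interact each step
        history.append((a, b))
        time.sleep(wall_seconds_per_step)  # "slow" vs "fast" emulation
    return history

slow = run(50, 0.01)  # ~0.5 s of wall time for 50 steps
fast = run(50, 0.0)   # the same 50 steps, as fast as the machine allows
print(slow == fast)   # True: uniform acceleration preserves the dynamics
```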
