
If anybody wants to point me to a good source claiming that parallel computing will slow economic impacts, I'd say thanks. I've found a few things like this:

http://www.cs.berkeley.edu/...
http://www.economist.com/no...

But it seems pretty ambiguous. This early paper seems to suggest many people thought parallel computing would be good. Could it be that expectations are just diminishing to moderate levels from too-high ones?

http://citeseerx.ist.psu.ed...


Die shrinkage decreased cost per transistor: the same process steps produced more transistors, and more energy-efficient ones at that. 3D layering does none of that, and as such is nearly irrelevant (absent dramatic improvements in the cost of the process).
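To make that concrete, here is a toy cost model in Python (the numbers are made up purely for illustration; the wafer cost, transistor count, and layer count are assumptions, not real fab data):

```python
# Toy cost model (illustrative numbers only, not real fab data).
wafer_cost = 10_000.0      # assumed cost of processing one wafer, in dollars
base_transistors = 1e10    # assumed transistors per wafer at the old node

# Die shrink: roughly the same process steps, ~2x the transistors per wafer.
shrink_cost_per_t = wafer_cost / (2 * base_transistors)

# 3D layering with k layers: ~k times the transistors, but also ~k times the
# patterning steps, so the wafer cost scales up with the layer count.
k = 4
layered_cost_per_t = (wafer_cost * k) / (k * base_transistors)

baseline = wafer_cost / base_transistors
print(f"baseline:   ${baseline:.2e} per transistor")
print(f"die shrink: ${shrink_cost_per_t:.2e} per transistor (roughly halved)")
print(f"3D layers:  ${layered_cost_per_t:.2e} per transistor (unchanged)")
```

Under these assumptions the shrink halves cost per transistor while layering leaves it flat, which is the point being made above.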


Statistical/informational entropy and classical thermodynamical entropy are related by Boltzmann's constant. S = k log W

No. Boltzmann's constant relates statistical mechanical (Gibbs) and informational (Shannon) entropy. Classical thermodynamical entropy is a related but different concept: http://en.wikipedia.org/wik...
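For concreteness, the standard definitions being contrasted here (textbook formulas, nothing new to this thread):

```latex
S_{\text{Gibbs}}   = -k_B \sum_i p_i \ln p_i          % statistical-mechanical entropy
H_{\text{Shannon}} = -\sum_i p_i \log_2 p_i           % informational entropy
S_{\text{Gibbs}}   = (k_B \ln 2)\, H_{\text{Shannon}} % same distribution; units differ by k_B ln 2
S                  = k_B \ln W                        % Boltzmann's formula: the uniform case p_i = 1/W
```

Clausius's classical thermodynamic entropy, by contrast, is defined through dS = δQ_rev / T and agrees with the statistical quantity only under the usual equilibrium assumptions, which is the distinction being drawn here.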

And I think you mean any *currently* practical computer

No.


Statistical/informational entropy and classical thermodynamical entropy are related by Boltzmann's constant. S = k log W

And I think you mean any *currently* practical computer


Entropy is the logarithm of the number of possible arrangements of quanta of energy among the allowed degrees of freedom.

I can't really parse that. Classical statistical mechanical/informational entropy is defined in terms of Gibbs/Shannon entropy, both of which are generalized to quantum statistical mechanics/quantum information theory by the von Neumann entropy.

Statistical/informational entropy and classical thermodynamical entropy are related but distinct concepts.

But that's beside the point. We were discussing computers. In any practical computer, you can't use all the physical degrees of freedom to perform computation.

You have a logical state space (a finite space in the classical case, or a finite-dimensional Hilbert space in the quantum case) which is implemented on a much bigger (infinite-dimensional for all practical purposes) physical state space. Each logical state corresponds to a region of the physical state space. Implementing reversible logical operations on a physically irreversible system is trivial, but doesn't yield any energy efficiency gain: logical bits are preserved, but physical entropy is still being generated.

In order to get arbitrarily small, ideally zero, energy consumption you need physical reversibility, or an arbitrarily close approximation of that. Such systems can't implement irreversible logic operations, hence the interest in reversible logical operations and how to convert irreversible algorithms to reversible ones.
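As a minimal sketch of that conversion (a toy Python example, not tied to any particular reversible-computing proposal): an irreversible gate like AND can be embedded in the reversible Toffoli gate by keeping the inputs around and writing the result into an extra bit, which is the standard trick for turning irreversible circuits into reversible ones.

```python
def and_gate(a: int, b: int) -> int:
    """Irreversible: the output alone cannot tell you what (a, b) was."""
    return a & b

def toffoli(a: int, b: int, c: int):
    """Reversible: flips c exactly when a and b are both 1; it is its own inverse."""
    return a, b, c ^ (a & b)

# Embed AND reversibly: start the target bit at 0 and keep the inputs.
for a in (0, 1):
    for b in (0, 1):
        a2, b2, c2 = toffoli(a, b, 0)
        assert c2 == and_gate(a, b)              # same logical result
        assert toffoli(a2, b2, c2) == (a, b, 0)  # applying it again uncomputes everything
```

The price is the extra bits that have to be carried along (or uncomputed later), which is essentially the "digital heat" issue raised elsewhere in this thread.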

But I wanted to point out that physically reversible systems don't exist (with the possible exception of the universe as a whole) and approximating them is as hard as approximating perpetual motion machines for the very same thermodynamical reasons.


Yes, you are right. As far as I know, the only practical use for quantum computers would be the simulation of quantum systems, although some of the necessary quantum algorithms for this are still missing. The last noticeable progress in the area I know of is already two years old: http://vcq.quantum.at/press...


Theoretically, yes.

Practically, it depends on how much of a big thing factorization is. Currently, the only application seems to be breaking certain cryptographic primitives like the RSA algorithm, and replacements for these primitives are already available.


Protein folding is NP-hard in the general case. NP-hard problems are conjectured to be hard even for quantum computers

(specifically, it is conjectured that P ⊊ BQP and that NP ⊄ BQP, where BQP means Bounded-error Quantum Polynomial time).

Specific instances of the protein folding problem (many of the biologically relevant ones, in particular) may be tractable even by classical computers. Quantum computers may or may not provide practical performance improvements in this area, depending on technical details currently impossible to predict.
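For reference, the standard picture behind that parenthetical (the containments are theorems, the strictness claims are conjectures):

```latex
% Known:        P \subseteq BPP \subseteq BQP \subseteq PSPACE
% Conjectured:  P \subsetneq BQP        (quantum computers beat classical ones on some problems)
% Conjectured:  NP \not\subseteq BQP    (NP-hard problems remain hard even for quantum computers)
```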


Another consideration must be factored in: waste heat. The more compact a circuit becomes, especially a 3D one, the harder it will be to dump heat. I've seen a few journal articles suggesting this might be as hard a limit on circuit density as the light-speed limit is on the practical size of computers.


I don't think digitizing heat (a.k.a. reversible computing) will be much of a thing. You do gain the ability to pipe heat around better - but that's about it. Nobody is going to bother actually running their machines backwards to recover a bit of wasted energy - and without that, reversible machines will just get clogged up with digital heat, and you gain little. I have a page giving more details on this topic: "Cellular Automata, Reversible Logic".


Everything indicates that the brain doesn't use any quantum mechanical effects (superposition or entanglement).


"NQPC" can't "radically" accelerate any computation since it is still a classical computer. E.g. if you double the cores of a super computer it will double i's speed at best. Or if you for example search a unsorted database the time complexity is always O(N). Regardless of whether you use a classical computer with hardware parallelization or not.


Grover's algorithm and similar ones do "only" provide a quadratic speedup on several problems, which could be nice but is not a really big thing. In contrast, Shor's algorithm provides an exponential speedup for factorization. (Which IS a really big thing.)
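Roughly, the asymptotics being compared (best known algorithms; constants and hardware overheads ignored):

```latex
% Unstructured search over N items:
%   classical:  \Theta(N) queries          Grover:  \Theta(\sqrt{N}) queries   (quadratic speedup)
% Factoring an n-bit integer:
%   best known classical (number field sieve):  \exp\big(O(n^{1/3} (\log n)^{2/3})\big)
%   Shor:  \mathrm{poly}(n), roughly O(n^3)     (superpolynomial, loosely "exponential", speedup)
```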


Is it? Entropy is the logarithm of the number of possible arrangements of quanta of energy among the allowed degrees of freedom. That is, up to a choice of units, the number of bits required to specify the system's exact microstate. It *is* a kind of information in a very real, physical sense.
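A standard worked example of that identification: for N independent two-state degrees of freedom,

```latex
W = 2^N, \qquad S = k_B \ln W = N\, k_B \ln 2, \qquad \log_2 W = N \text{ bits},
```

so each k_B ln 2 of entropy corresponds to one bit of missing information about the exact microstate; the same k_B T ln 2 appears as Landauer's bound on the energy cost of erasing one bit.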


Well, we know for sure brain emulations can run on small computers: the human brain itself is the proof, and without a body with muscles and nerves to run, the brain could be even smaller. I don't think we need quantum computers to build an artificial brain. If we can figure out the algorithms the brain uses, that'll be a big step already.


I wouldn't think factoring-type problems or general quantum simulation are amenable to NQPC. Whether a database search can be done faster with QC (Grover's algorithm) vs NQPC is going to depend on something like the marginal technical cost of additional qubits vs. additional processors. I have terrible intuition here but my guess is NQPC would win.
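A back-of-the-envelope version of that trade-off (a toy model that ignores memory access, error correction, and clock-speed differences):

```latex
% Classical search, p processors, N items:   T_{\text{classical}} \approx N / p
% Grover search, one quantum processor:      T_{\text{Grover}} \approx \sqrt{N}
% Break-even:   N/p \approx \sqrt{N} \iff p \approx \sqrt{N}
```

So the comparison comes down to whether roughly √N classical cores are cheaper than a ~log2(N)-qubit machine that can stay coherent for ~√N steps, which is the "marginal cost of additional qubits vs. additional processors" question above.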
