Me: Current growth rates simply cannot continue at familiar levels for ten thousand more years. We’ll eventually learn everything worth knowing about how to arrange atoms, and growth in available atoms will be limited by the speed of light.

A maximally efficient, maximally growing physical economy can be represented as the surface of a sphere expanding through space at the speed of light. It consumes all matter and energy with maximum efficiency. Inside the expanding sphere there is only maximum-entropy waste (infrared radiation) that can't be used for anything.

The size of the physical economy is proportional to the surface area of the expanding sphere, A = 4πr². The growth of that surface as a function of the radius is dA/dr = 8πr.
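A quick sketch of why this matters (assuming units where c = 1 light-year per year, purely for illustration): the surface area grows only polynomially in time, so the year-over-year percentage growth of a light-speed-limited economy falls toward zero.

```python
import math

C = 1.0  # assumed expansion speed: 1 light-year per year

def area(t):
    """Surface area of the frontier sphere after t years: 4*pi*(C*t)^2."""
    return 4 * math.pi * (C * t) ** 2

# Year-over-year growth of the frontier declines as the sphere grows.
for t in [10, 100, 1000]:
    growth = area(t + 1) / area(t) - 1
    print(f"year {t}: +{growth:.2%}")
```

Polynomial growth means any fixed percentage growth rate is eventually unsustainable, which is the point being made above.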

Assuming that trivial immaterial growth does not provide significant value, and that a competitive, intellectually greedy culture strives toward maximum complexity and information-processing capability, the limits of immaterial growth are tied to the limits of computation.

Virtual-world expansion would have diminishing returns. So would improvements in detail, at least if we hold other things equal (without perceptual enhancements, I will not notice much improvement below the microscopic level, as already mentioned).

This seems true in gaming now, as far as I know. I'm a gamer, and in my experience, doubling the number of available towns/planets again and again would quickly stop improving my gaming experience. Making the game more detailed would also have diminishing returns, though those wouldn't really kick in hard until it approached being indistinguishable from real life.

Either way, I don't see how infinite virtual expansion would help much (as much as I'd love gaming to improve beyond what it currently is like).

Human beings are not driven to maximize pleasure. Instead, pleasure is a tool that drives human beings to maximize other things (though I think we are currently unsure exactly what). Bryan seems to imply that to improve must mean to have more pleasure, but we are much more complicated than that...

If it turns out that baryon number is not conserved, so that we can convert neutrons and protons into photons, we should count photons instead of atoms. Given that there are over 20 orders of magnitude of frequency to work with, we may be able to store 10^30 bits per photon, so the neutrons and protons in our Sun, converted into photons with the average energy of blue light, could store about 10^97 bits. Of course, we will need at least a few leptons around to actually process that information and recycle the photons. That's classical data storage -- entangled photons may be far better still, at least in terms of raw information capacity.

That gives us some more room at the bottom, but growth is still sub-exponential unless, as mentioned, we rarely collapse qubit superpositions (i.e., if qubit-minds rarely actually think about anything but just keep storing more memories).

If world population growth continued at today’s rate, in only 1100 years every drop of water on earth (including oceans) would be locked up in the makeup of human flesh.

The world is growing at about 78 million people per year. If growth continued at that rate for 1100 years, the world population would be about 90 billion.
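The arithmetic behind that figure, assuming a baseline of roughly 6.8 billion (approximately the world population when this was written). Note this is a linear projection, not exponential:

```python
baseline = 6.8e9          # assumed world population at the time of writing
annual_increase = 78e6    # net new people per year
years = 1100

population = baseline + annual_increase * years
print(f"{population / 1e9:.0f} billion")  # roughly 90 billion
```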

I think you believe world population is growing exponentially, but it's not. It hasn't been growing exponentially for several decades.

The very act of using space more efficiently creates a problem for which there is no solution: it inevitably begins to drive down per capita consumption and, consequently, per capita employment, leading to rising unemployment and poverty.

Computers have been improving in space utilization more than just about any industry for many decades. Not only is our consumption of computers not slowing down, it's increasing. This is leading to more employment and less poverty (ask the employees of Microsoft...especially those who bought MS stock).

As a spin-off, I did a calculation of the inevitable thermodynamic costs of expanding a civilization: http://tinyurl.com/l55d8b

It turns out that a "classical" civilization that just converts matter into information storage has no real limit to its expansion speed. However, a civilization close to the Bekenstein bound in terms of information storage has such an enormous entropy cost in converting matter that its expansion is actually heat dissipation limited!

Still, dealing with the entropy output of an advanced civilization is going to be challenging. Good waste management will just grow in value.

While the Bekenstein bound and similar considerations put an upper limit on the number of distinguishable states in a region of space with a certain amount of matter, they do not place any limit on the value of those states (as noted above). However, the value has to be experienced by entities inside the region. There is also a finite bound on the number of distinguishable utility functions, but this number is very large: even fairly simple computational systems can calculate very complex functions, and here the evaluation system could potentially be as big and complex as the entire region (leaving little but itself to evaluate).

I wonder whether one gets more utility from many agents with relatively simple evaluation systems, or from a few agents with very complex ones?

> If space is continuous, then a single spin-zero particle has an uncountably infinite dimensional Hilbert space even though it is said to have 3 degrees of freedom.

Yes, sorry for the sloppy terminology.

> I think that classical computational power is still bounded in continuous space because of experimental error, which may have entropy-imposed minimums. I'm not sure about this.

This is a good point. I retract my statement that continuous systems "presumably" have infinite computational power. In fact, Jose Felix Costa and others have a body of work showing that in a classical universe where higher-precision measurements take more time to perform, the computational power is reasonable.

Incidentally, I am currently visiting UCSB, if that is where you are. Email me if you'd like to have lunch or something.

I've written about growth and models at 'how to spot a model that actually works' and at 'models, reality and the limits to growth'. From the latter: "Limits – ‘real’ limits – are only ever met in the concrete and the particular, not in the universal and abstract. Therefore, what concrete and particular limits have to tell us about the world in general is always open to interpretation and argument."

It is like saying that if you flip 100 coins then you have created 2^100 parallel universes.

Branching/decoherence happens all the time, at a furious pace, whether we flip coins or not (or measure spins or not). Our measurements do not significantly alter the rate of branching.

However, you are right that the only sensible way to value future states is through the Born weights.

If the universe has finitely many degrees of freedom, then the corresponding space of quantum states is finite dimensional, and the whole system could in principle be simulated with finite computational power by storing and manipulating a gigantic state vector. If the universe has infinitely many degrees of freedom, then it presumably has infinite computational power already classically.
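A minimal sketch of what "storing and manipulating a gigantic state vector" means (a toy illustration, not a full simulator): an n-qubit system needs 2^n complex amplitudes, and every gate is just a linear map on that vector. The exponential memory cost is why the vector is gigantic, but it remains finite.

```python
import math

def apply_hadamard(state, qubit, n):
    """Apply a Hadamard gate to one qubit of an n-qubit state vector."""
    h = 1 / math.sqrt(2)
    new_state = state[:]
    for i in range(2 ** n):
        if not (i >> qubit) & 1:        # handle each amplitude pair once
            j = i | (1 << qubit)
            a, b = state[i], state[j]
            new_state[i] = h * (a + b)
            new_state[j] = h * (a - b)
    return new_state

n = 3
state = [0j] * (2 ** n)                 # 2^n amplitudes for n qubits
state[0] = 1 + 0j                       # start in |000>
state = apply_hadamard(state, 0, n)     # put qubit 0 in superposition
assert abs(sum(abs(a) ** 2 for a in state) - 1) < 1e-12  # norm preserved
```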

If space is continuous, then a single spin-zero particle has an uncountably infinite dimensional Hilbert space even though it is said to have 3 degrees of freedom.

I think that classical computational power is still bounded in continuous space because of experimental error, which may have entropy-imposed minimums. I'm not sure about this.

Whether a system is in a superposition or not is a basis-dependent statement. A spin can be in a superposition of up and down (two branches), but if you consider its orientation along the left-right axis, the same state of the spin can correspond to being oriented right only (one branch). “How many branches?” is simply the wrong question.
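A toy illustration of the basis-dependence point (a two-amplitude sketch, not a quantum library): the state (|up> + |down>)/√2 has two nonzero amplitudes in the z-basis but only one in the x-basis.

```python
import math

h = 1 / math.sqrt(2)
state_z = [h, h]  # (|up> + |down>) / sqrt(2), written in the z-basis

# Rewrite in the x-basis, where |right> = (|up> + |down>)/sqrt(2)
# and |left> = (|up> - |down>)/sqrt(2).
amp_right = h * (state_z[0] + state_z[1])
amp_left = h * (state_z[0] - state_z[1])

z_branches = sum(abs(a) ** 2 > 1e-12 for a in state_z)
x_branches = sum(abs(a) ** 2 > 1e-12 for a in (amp_right, amp_left))
print(z_branches, x_branches)  # two branches in one basis, one in the other
```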

This is true near the atomic level, where branching is fuzzy and not well-defined. But as you approach the macroscopic level, the branches become well-defined (though not necessarily discrete). Otherwise, we wouldn't be able to agree on what our current classical world was.

In other words, it's not sensible to say whether a single spin is in a superposition or not (since it's basis dependent), but it is sensible to say that we are in the (discrete) branch where the spin was measured to be spin up.

Although a single spin doesn't have a preferred basis, macroscopic systems (i.e., the classical world) do. This basis is preferred through einselection, and its origin can be traced back to the form of the interaction Hamiltonian. For most classical systems, the Hamiltonian has a momentum-dependent part and a position-dependent part. The einselected basis is then roughly the set of states that are approximately localized in both position and momentum: minimum-uncertainty wavepackets.

If world population growth continued at today's rate, in only 1100 years every drop of water on earth (including oceans) would be locked up in the makeup of human flesh.

The biggest obstacle we face in changing attitudes toward overpopulation is economists. Since the field of economics was branded "the dismal science" after Malthus' theory, economists have been adamant that they would never again consider the subject of overpopulation, and they continue to insist that man is ingenious enough to overcome any obstacle to further growth. Even worse, economists insist that population growth is vital to economic growth. This is why world leaders continue to ignore population growth in the face of mounting challenges like peak oil, global warming and a whole host of other environmental and resource issues.

But because they are blind to population growth, there's one obstacle they haven't considered: the finiteness of space available on earth. The very act of using space more efficiently creates a problem for which there is no solution: it inevitably begins to drive down per capita consumption and, consequently, per capita employment, leading to rising unemployment and poverty.

If you're interested in learning more about this important new economic theory, then I invite you to visit either of my websites, OpenWindowPublishingCo.com or PeteMurphy.wordpress.com, where you can read the preface, join in the blog discussion and, of course, buy the book if you like.

The individual virtual realities grow very slowly, like the harmonic series, while the overall civilization grows like an expanding sphere.
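A sketch of that contrast (assuming, purely for illustration, one new harmonic-series term per year for an individual world and c = 1 for the civilization's frontier):

```python
import math

def world_size(t):
    """Individual virtual world: harmonic-series growth, roughly ln(t)."""
    return sum(1 / k for k in range(1, t + 1))

def civilization_size(t):
    """Overall civilization: sphere surface area at radius t (c = 1)."""
    return 4 * math.pi * t ** 2

# The individual world barely grows while the civilization explodes.
for t in [10, 100, 1000]:
    print(t, round(world_size(t), 2), round(civilization_size(t)))
```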

I personally want to believe that the universe has the structure of the surreal numbers.

https://5677451.blogspot.se...


Sounds like warfare's not going anywhere.


Tim: The Bekenstein bound puts hard limits on the available room at the bottom.


Pete Murphy, Author, "Five Short Blasts"