Perhaps the vagueness in the past hypothesis is the problem.

Which seems to me to basically say that this anomaly is about as bad as it could possibly be.

Let us say instead that the anomaly is exactly as bad as it could possibly be: the large-scale entropy of the universe at the Big Bang is zero. This is a very strong claim, and (like all extreme statements) it's simple too. It's too strong, compared to the evidence, to justify stating it without qualification, but as a hypothesis I like it.

While I'm here, I'll recommend Huw Price's 1996 book Time’s Arrow and Archimedes’ Point on the arrow of time, although I don't agree with Price about everything.

And I agree that the silence about this, the idea that the second law is fully understood, is a scandal.

Paul, the anthropic principle explains the low entropy of the origin. That explanation trumps Occam's razor. Even if low-entropy thermodynamic states had long descriptions, we would still see them at the origin, since otherwise observers would not have evolved.

Stuart, yes I recall Penrose's flat past hypothesis; is it still considered viable?

Hard to tell. It is very speculative, and Penrose is reaching the end of his career, and some people are clearly humouring him... But there is some weak evidence supporting it, and it does appear to be testable.

And the mathematics appears correct: in the absence of the Ricci tensor, the slow, infinite, cold end of the universe is indistinguishable from the fast-burning expansion of the big bang...

@Tim Tyler: I follow that, but it doesn't matter: obviously the concisely-describable high-entropy states are a tiny fraction of the total, so it's *still* justified to assign a non-negligible prior to a low-entropy state if you're using Solomonoff induction.

You've got to be very careful how to set up the boundary conditions, even in a very simplified classical regime. For example, any collection of particles assembled in a finite volume of space, no matter how large, and allowed to expand outward for whatever length of time you care to specify into a vacuum of infinite space will be in a condition of 'low entropy' in comparison to later times. It's also not very difficult to define conditions so that the total energy of the universe is zero. Thus, a localized 'quantum fluctuation' in infinite space and time would under these conditions quite nicely start out in a condition of 'low' entropy and propagate forward in time into the infinite future in a very natural way.

I'm not endorsing this simplified model at all, btw. Just pointing out that reasoning by analogy to a more familiar situation can lead to apparently nonsensical results.
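The free-expansion point above can be made concrete with a toy 1-D simulation: particles packed into a unit interval, given random velocities, and coarse-grained over a much larger fixed region. The region size, bin count, and particle numbers here are arbitrary choices for illustration, not anything physical:

```python
import math
import random

def coarse_entropy(positions, lo, hi, bins):
    """Shannon entropy of the binned (coarse-grained) position distribution."""
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in positions:
        i = min(max(int((x - lo) / width), 0), bins - 1)
        counts[i] += 1
    n = len(positions)
    return -sum(c / n * math.log(c / n) for c in counts if c)

random.seed(0)
n = 10_000
x0 = [random.random() for _ in range(n)]        # packed into [0, 1): a 'low entropy' start
v = [random.gauss(0.0, 1.0) for _ in range(n)]  # random velocities

# Free expansion into a much larger region [-100, 100], coarse-grained into 50 bins.
entropies = [
    coarse_entropy([x + vi * t for x, vi in zip(x0, v)], -100.0, 100.0, 50)
    for t in (0.0, 1.0, 5.0, 20.0)
]
print([round(h, 3) for h in entropies])
```

The coarse-grained entropy climbs monotonically as the cloud spreads, precisely because the initial packing is special relative to the larger region it expands into.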

Robin:

Sean, the "past hypothesis" seems to me more a restatement of our failure than an actual solution. And I'd be more comfortable accepting it if I saw some formal calculations showing what it implied; the hand-waving makes me nervous.

The low-entropy-past hypothesis is neither a full solution nor a restatement of our failure; it is a partial explanation. It is like starting with the question "Why is this room hot?" and getting the answer "because there is a thermostat connected to a heating device, which produces heat until the room is brought to a certain temperature". Sure, this explanation prompts the question "why is the thermostat set so high?", but that doesn't mean the explanation is just a useless restatement of the question.

In comparing the low-entropy-past hypothesis to the fictional "life appeared even though it seems unlikely" law, you ignore the substantial predictive power of the former.

Robin, this is false, weakly so in the first case[1], strongly so in the second. Please read what I write.

[1] Unless you have, for example, the definitive explanation of whether the universe is open or closed, etc. You are confusing local, gravity-free cases with global cases in which gravity matters. In other words, don't extrapolate from a gas enclosed inside a jar.

It almost seems like people are iterating through all possible misunderstandings, suggesting I should have just ended the post with all possible disclaimers. :)

Phil, I said statistical mechanics was bad at predicting the past, not thermodynamics; it gets teakettles wrong too.

Scent, we know how to calculate the future, so calculating the past would be just as easy if the same approach applied.

James, Scent is right about records.

mjgeddes, Tim is right; high entropy states can be very simply indicated.

Even if the hypothesis of a low-entropy origin of the universe was proved false, this would not invalidate thermodynamics or the 2nd Law. So clearly it is incorrect to say that thermodynamics somehow "depends" on the Past Hypothesis or that it is somehow "scandalous" that the cosmological implications of the 2nd Law are not explained in textbooks.

Phil, there's a lot of handwaving going on, as you suggest, and all sorts of appeals to a false analogy. Let me quote:

I'm pointing out the difficulty of what counts as 'improbable'. For example, if you look at just the egg, and a sample of dust and gas containing all the constituent atoms needed to make the egg, say, a sample containing 10^30 atoms in a volume of 10^30 cm^3, it's very improbable that anything even remotely resembling an egg in any state will form, and then still much, much more improbable that a perfect, whole egg will form. A Boltzmann's Egg vs. a Universal Egg, if you will.

But suppose that instead of 10^30 atoms, you have 10^60 atoms (in 10^60 cm^3) to play with. Naively, the formation of a Boltzmann egg is still vanishingly improbable, the Universal egg much more so. However, what really happens is that under the effects of gravitation this nebula of gas and dust condenses to form a sun with planets, life evolves, and so - practically in the blink of an eye as these time scales are reckoned - a perfect egg is formed. Under this scenario, the probability of forming a perfect unbroken egg is far, far higher than the probability of an imperfect, broken egg assembled by pure chance.

So how much more improbable is the formation of a Universal egg as opposed to a Boltzmann's egg?

That's the problem here. It's not at all clear how to count states to perform the notional summing; no one really has a handle on what entropy means in a universe with certain types of gravity and certain geometries, or even, for that matter, on all the requisite basic physical laws. Without this knowledge, no one knows how contingent certain events are in these ensembles versus other events. Are they as strongly contingent as the formation of water molecules in a box filled with oxygen and hydrogen? As strongly contingent as the evolution example given above? And so on and so forth.

Indeed, most physicists say that we're nowhere near close enough to having a handle on these problems to make any sorts of claims about things like the relative probabilities of Boltzmann Brains forming. (I suppose that if you want to talk about a very restricted universe, one that is open, completely flat, has no gravity, and can be assumed to have a 'statistically even' distribution of matter, one might be able to make some sort of halfway plausible statement. That's not the universe we live in, however.)

This is all well known inside the physics community, and for these reasons most researchers discount the Boltzmann Brain argument. Those who don't are considered outside the mainstream. Note that no one is talking about whether or not the universe arose from a 'statistical fluctuation' per se. Most people in fact would be unsurprised to find out that our universe didn't start out the way it did because of one of these problematic events. It's just that you can't use this kind of argument (at least, not yet) to discount the hypothesis.

I hope that clears things up.
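As an aside, the brute state-counting half of the egg comparison is simple arithmetic; it's the gravity and geometry that are hard. A sketch, where the egg's atom count (~10^24) is my own rough assumption for illustration:

```python
import math

def log10_all_in_subvolume(n_atoms, fraction):
    """log10 probability that n_atoms independent atoms all occupy a sub-volume
    that is `fraction` of the total volume (crude ideal-gas state counting)."""
    return n_atoms * math.log10(fraction)

# The quote's 10^30 cm^3 sample: the chance that ~10^24 atoms (a rough,
# assumed figure for an egg's worth of matter) all land in one particular
# 1 cm^3 cell by chance alone.
lp = log10_all_in_subvolume(1e24, 1e-30)
print(lp)
```

The result is a log-probability around -3x10^25, which is the sense in which a Boltzmann egg is 'vanishingly improbable'; the quote's point is that adding gravity changes which configurations are actually reachable, not this arithmetic.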

@Paul: Did you read my last comment? Informational entropy is one thing, and thermodynamic entropy is another. It may be quite possible to specify a heat-death-like state highly concisely if you are using a Turing machine for the specification.

A couple of questions:<ul><li>When you say thermodynamics is bad at projecting into the past, you're only talking about the origin of the universe, right? You're not saying we get the wrong results when talking about teakettles.</li><li>When you say "the universe long ago had very low entropy", do you mean that the distribution of possible past states, conditioned on the present, is a nearly-flat distribution?</li></ul>It sounds to me like you're just saying that (it's harder to predict the past than to predict the future) = (entropy increases).

Or is the difficulty that we don't have priors on initial states? You might be saying that the set of starting states we consider reasonable is only a small subset of the set of possible starting states. There can only be a set of starting states that we consider reasonable if we are secretly assigning priors to starting states. We don't enter these priors on starting states into the equations, so the output doesn't reflect them.

We have many more high-entropy predecessor states than low-entropy because there are so many more high-entropy states.

This statement doesn't seem to make any sense, on the face of it. Perhaps a specific example would help. I get the impression that there's some confusion between what is commonly taught as thermodynamics, and the system to which the laws of thermo are being applied.

@Cyan: there's no contradiction between what I said and what Sean said. We have many more high-entropy predecessor states than low-entropy because there are so many more high-entropy states.
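Here's a specific toy example of that claim: microstates evolve by a fixed random permutation (a reversible 'law', so every state has exactly one predecessor), and macrostates are popcount classes of very different sizes. A sketch under those toy assumptions, nothing more:

```python
import random
from collections import Counter

random.seed(1)
BITS = 12
N = 2 ** BITS                          # 4096 microstates

def macro(s):
    """Macrostate = number of set bits; middle values contain vastly more microstates."""
    return bin(s).count("1")

# A reversible 'dynamical law': a fixed random permutation of the microstates,
# so each microstate has exactly one predecessor.
perm = list(range(N))
random.shuffle(perm)
pred = {perm[s]: s for s in range(N)}  # successor -> predecessor

# Condition on a low-entropy 'present': macrostates with at most one set bit.
present = [s for s in range(N) if macro(s) <= 1]       # only 13 microstates
pred_macros = Counter(macro(pred[s]) for s in present)
print(sorted(pred_macros.items()))
```

Almost all the predecessors of the rare low-popcount states land in the huge middle macrostates, simply because that is where almost all microstates live.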

As I say, if your prior is the Solomonoff prior, there's a huge weighting in favour of the low-entropy predecessor state.
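A toy version of the Solomonoff point, using zlib output length as a very crude stand-in for Kolmogorov complexity (it's only a loose upper bound, and a real Solomonoff prior is uncomputable):

```python
import random
import zlib

def description_bits(state: bytes) -> int:
    """Crude stand-in for description length: zlib output size in bits.
    Only a loose upper bound on the true Kolmogorov complexity."""
    return 8 * len(zlib.compress(state, 9))

random.seed(0)
n = 10_000
low_entropy = bytes(n)                                         # all zeros: tiny description
typical_high = bytes(random.randrange(256) for _ in range(n))  # typical high-entropy state

# Under a Solomonoff-style prior ~ 2^(-description length), the gap below
# corresponds to a weighting factor of roughly 2^(tens of thousands)
# in favour of the compressible state.
print(description_bits(low_entropy), description_bits(typical_high))
```

Note this also illustrates Tim's point upthread: *some* high-entropy states (e.g. one specified as "the maximum-entropy state") compress well, but a typical one does not.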

All he's saying, James, is that according to the calculations of some people, it is more likely for the book you are reading to have condensed out of intergalactic space through completely random fluctuations than it is to be the result of some sort of evolutionary process starting far back in time, say, the coalescing of a protostellar cloud into our present-day Sun and planets.

To say that these calculations are . . . questioned would be an understatement. The current consensus is that there is no way to make these sorts of calculations given the state of the art.

For example, we might think we know about the past via our memories and records, but this standard approach says our records are far more likely to result from random fluctuations than to actually be records of what we think they are.

Even on rereading, this makes no sense to me.