From Eternity To Here

Out today, Sean Carroll’s new book, From Eternity to Here, is excellent.  After reading a draft in March, I wrote:

We are far from understanding thermodynamics. … The distributions we would usually use to successfully predict [physical system] futures are completely, totally, and almost maximally WRONG for predicting their pasts!  …  Worse, this “past hypothesis” is ambiguous in several ways … Only a tiny handful of physicists (and philosophers) are trying to explain this past hypothesis; … no one is even remotely close.

Here is Carroll’s proposed solution scenario:

  1. Physics is always exactly locally time-reversible.
  2. Each small region of space has bounded entropy, yet an infinite state space.
  3. So entropy has no upper bound, so systems are never in full equilibrium.
  4. Our local universe is expanding with a weak dark energy.
  5. Our distant future is a forever expanding emptiness at 10⁻²⁹ K.
  6. Very rarely, local fluctuations there build brains like ours.
  7. Far more rarely, local fluctuations pop a tiny new universe.
  8. Tiny new universes are very curved and thus very dense.
  9. Dense regions generically expand to get less dense.
  10. In some dense expanding regions, a dark energy starts eternal inflation.
  11. Inflation makes flat uniform local universes with scale-less fluctuations.
  12. Local universes sit in different local minima with different local physics.
  13. In some, scale-less fluctuations make galaxies etc. and brains like ours.
  14. Those local universes also get empty, then rarely pop tiny new universes.
  15. On average each tiny new universe gives rise later to several more.
  16. So there are an infinite number of local universes.
  17. A region in our past pops tiny universes in both time directions.
  18. There are overall far more brains like ours than fluctuation brains.

Many of these are far-from-proven conjectures, but still it does all hold together. Locally infinite state spaces (#2) might appear to conflict with the holographic principle:

There is a maximum amount of entropy you can possibly fit into a region of some fixed size, which is achieved by a black hole of that size.

But it doesn’t conflict; region size is neither constant nor bounded.  Even so, it is very hard to over-emphasize just how far one must project current physics beyond the accuracy with which we have verified it to talk about tiny new universes popping out of quantum fluctuations in empty space at 10⁻²⁹ K.  It will be truly incredible if we get that right.
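For reference, the maximum the quoted principle refers to is the Bekenstein–Hawking entropy of a black hole whose horizon has area A; this is standard physics rather than anything quoted from the book:

```latex
S_{\max} \;=\; \frac{k_B\, c^3\, A}{4\, G\, \hbar} \;=\; \frac{k_B\, A}{4\, \ell_P^{\,2}},
\qquad
\ell_P \equiv \sqrt{\frac{G \hbar}{c^3}} \approx 1.6 \times 10^{-35}\ \mathrm{m}.
```

Because this bound scales with the area of the region rather than being a universal constant, a region that keeps growing keeps raising its own ceiling, which is how unbounded total entropy can coexist with the holographic principle.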

On style, I’m again struck by how different is the public’s preferred style for popular physics vs. economics books.  Popular physics books, like Carroll’s, act easy and friendly, but still lecture from on high, sprinkled with reverent stories on the “human side” of the physics Gods who walk among us.  They grasp for analogies to let mortals glimpse a shadow of the glory only physicists can see directly.

The recent popular econ book SuperFreakonomics is also excellent, but very different in tone.  Also easy and friendly, it is full of concrete stories about particular data patterns and what lessons you might draw from them, or might not; hey, it is always up to you, the reader, to judge.  Such books avoid asking readers to believe anything abstract or counter-intuitive based on the author’s authority.

The main difference, I think, is that readers don’t fundamentally care about physics, so can’t get worked up disagreeing with physics authors.  They read to affiliate with great men, and to lord their greater knowledge over lesser associates.  In contrast, people actually care about many economics topics, and our democratic culture, where everyone’s political opinions are officially equally valued, simply can’t accept opaque expertise on such things.

  • http://timtyler.org/ Tim Tyler

    I don’t see any evidence at all for locally infinite state spaces. Rather, there’s the Bekenstein bound. Why hypothesise infinities without enormous quantities of supporting evidence? Using infinities just signals to everyone else that your theory is broken.

    • http://hanson.gmu.edu Robin Hanson

      It seems hard to escape such an infinity if local tunneling to create a baby universe is possible. In that case the state space has to include all possible states of that baby universe, including states of all its descendant baby universes.

      • Tim Tyler

        That’s a huge “if” – one not supported by any evidence I have heard of.

        IMO, you would need *good* evidence to hypothesise infinities in physics, not a load of ungrounded speculation.

  • mjgeddes

    Nope. His explanation is hopelessly contrived (too many steps!). I present an alternative answer:

    (1) There’s a global boundary condition placed on space-time, such that choosing different scales for coarse graining does not significantly affect an observer’s measured quantities, i.e., a fractal condition: the universe on a large scale looks like the universe on a small scale.

    (2) The boundary condition defines a global (absolute) complexity measure for informational entropy.

    (3) Weight of subjective experience is correlated with informational entropy.

    (4) Thermodynamic entropy is a ‘spin-off’ (secondary consequence) of informational entropy.

    (5) Since (3) and (4) have a common cause (both weight of subjective experience and thermodynamic entropy depend on the boundary condition), and (1) implies an absolute complexity measure, this produces a time asymmetry: subjective experience is universally correlated with thermodynamic entropy, and thus all conscious observers must perceive low-entropy states as being in the past.

    • Tim Tyler

      That’s an explanation for why we remember the past, and not the future:

      “Our ability to remember the past but not the future also coincides with the arrow of entropy. The reason, Hawking says, is that whenever a memory is made, in either a brain or a computer, the smidgen of energy required to light up a neuron or move an electron is released as heat. Heat - roiling, chaotic heat - increases entropy. Memories, then, because they release heat, increase disorder too. Entropy increases from yesterday to tomorrow. That’s why memories are made in the past.”

      My understanding is that this post is about another issue. We do, in point of fact, have a low entropy past (the big bang) – and the anthropic principle dictates that evolved creatures will look back on low-entropy beginnings.

  • Stuart Armstrong

    The main difference, I think, is that readers don’t fundamentally care about physics, so can’t get worked up disagreeing with physics authors. They read to affiliate with great men, and to lord their greater knowledge over lesser associates. In contrast, people actually care about many economics topics, and our democratic culture, where everyone’s political opinions are officially equally valued, simply can’t accept opaque expertise on such things.

    The real difference is that physics has a much better track record in its predictions than economics; moreover, people do have intuitions about economics, but never about electrons and waves.

    • Millian

      This is correct;
      but it is also correct, as Robin says, that people have stronger intuitions about economics and become uncomfortable when told that those intuitions are untrue (for instance, on free trade, where predictions have been strongly consistent with reality);
      and it is also correct that the field of economics is less settled than physics, and its predictions depend on assumptions about psychology, which is itself not settled;
      and it is finally correct that human behaviour is more difficult to predict than physical behaviour, so some questions cannot be settled and predictions must be revised to account for new information, which is itself a large limiting factor on economic sciences (e.g. the release of the Penn World Tables enabled a lot of empirical work that was previously impossible).

  • ECM

    I enjoy reading these books, but they are, generally, science fiction w/o the interesting characters.

  • Stuart Armstrong

    17. A region in our past pops tiny universes in both time directions.

    Does this mean “there exists a point of low entropy in the past”, and hence (statistically) the second law of thermodynamics must hold after this point and run in reverse before it, since entropy is approximately continuous?

    If this is the central insight, it may remain true even if some of the points above it are wrong…

  • salacious

    Can someone answer a question for me? I’m trying to check my understanding of the arrow of time problem:

    All physics is microscopically time reversible. You time evolve a system from state A to state B, where the entropy of state B is higher than that of state A. You then set up a similar system which begins in state B*, identical to state B except that all the particles have their velocity vectors reversed. If you then time evolve this second system, it will look like the first system running in reverse, going from state B to state A, from high entropy to low entropy. (This ignores CPT stuff, which I don’t entirely understand.) For example, you drop an egg on the floor. If you reverse all the velocity vectors of the egg-floor system, it should evolve in such a way as to shoot the egg up in the air and reassemble it.

    Furthermore, state B and the reversed state B* are thermodynamically indistinguishable. They are both microstates which correspond to the same thermodynamic macrostate. Theoretically, if we know the system is in that thermodynamic macrostate, the chance that the system is in microstate B should be equal to the chance it is in microstate B*. When the system time evolves, it should be equally likely that the system evolves from B to some new state C, increasing entropy, as that it evolves from B* to A, decreasing entropy. But we never see the latter. Systems are almost always in states like B, where time evolution increases entropy; this is the second law of thermodynamics. We never see the egg bounce off the floor and reassemble itself.

    Is this an accurate explanation of the arrow of time problem? Am I missing something important?
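    A toy numerical sketch of the velocity-reversal point (purely illustrative; the setup and names below are assumptions, not anything from the thread): non-interacting particles on a periodic box start clustered, spread out so that a coarse-grained entropy rises, and reversing every velocity makes the very same dynamics carry them straight back.

```python
# Purely illustrative sketch: free-streaming particles on a periodic box.
# Reversing all velocities (state B*) makes the same dynamics retrace the
# expansion exactly, returning to the low-entropy cluster (state A).
import numpy as np

rng = np.random.default_rng(0)
L, n, T = 1.0, 10_000, 5.0

x = rng.uniform(0.0, 0.05 * L, n)      # state A: all particles clustered
v = rng.normal(0.0, 1.0, n)

def coarse_entropy(pos, bins=50):
    """Shannon entropy (bits) of the coarse-grained occupation histogram."""
    counts, _ = np.histogram(pos, bins=bins, range=(0.0, L))
    p = counts[counts > 0] / pos.size
    return float(-(p * np.log2(p)).sum())

print("t = 0 :", coarse_entropy(x))    # low: only a few bins occupied
x = (x + v * T) % L                    # free streaming forward in time
print("t = T :", coarse_entropy(x))    # near the maximum, log2(bins)

v = -v                                 # state B*: reverse every velocity
x = (x + v * T) % L                    # same dynamics, same duration
print("t = 2T:", coarse_entropy(x))    # back to the low-entropy cluster
```

    Because nothing here is approximated, the reversed state really does march back down in entropy; the catch, as the comment says, is that states like B* are never what we actually find ourselves in.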

    • Tyrrell McAllister

      Microstates with reversed velocity vectors needn’t be in the same macrostate.

    • Tim Tyler

      There is no “arrow of time problem”. Those who think there is are just in a muddle about the issue. The reason the psychological arrow of time and the thermodynamic arrow of time are linked is rather well understood. The explanation is even in A Brief History of Time, IIRC – though Hawking was pretty confused about the issue back then, and got other bits of it all wrong.

      • Tyrrell McAllister

        There is no “arrow of time problem”. Those who think there is are just in a muddle about the issue. The reason the psychological arrow of time and the thermodynamic arrow of time are linked is rather well understood.

        But that’s not the really hard part of the arrow-of-time problem. Given a thermodynamic arrow, you can probably derive a psychological arrow. (This 12.5 minute BloggingHeads segment makes the case that it’s not as straightforward as many think, though.) The hard problem is closer to what salacious described—it’s showing why there’s a thermodynamic arrow in the first place.

      • http://timtyler.org/ Tim Tyler

        Re: “The hard problem is closer to what salacious described – it’s showing why there’s a thermodynamic arrow in the first place.”

        That seems pretty trivial to me: you get analogs of the second law in practically any reversible cellular automaton that exhibits complex behaviour.
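        For what it’s worth, a minimal sketch of such an analog, assuming a second-order reversible elementary cellular automaton (a standard construction; the code and names here are illustrative, not from the thread):

```python
# Illustrative sketch: a second-order reversible elementary CA.
# Update: next = rule(current) XOR previous, so previous = rule(current) XOR next,
# i.e. the dynamics is exactly invertible, yet block entropy rises from a
# special low-entropy starting row.
import numpy as np

def step(curr, prev, rule=90):
    """One reversible step: returns the next row."""
    left, right = np.roll(curr, 1), np.roll(curr, -1)
    idx = 4 * left + 2 * curr + right          # neighbourhood code 0..7
    table = (rule >> np.arange(8)) & 1         # Wolfram rule lookup table
    return table[idx] ^ prev

def block_entropy(row, k=4):
    """Shannon entropy (bits) of the length-k block distribution."""
    codes = sum(np.roll(row, -i) * 2**i for i in range(k))
    p = np.bincount(codes, minlength=2**k) / row.size
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

n = 256
prev = np.zeros(n, dtype=int)
curr = np.zeros(n, dtype=int)
curr[n // 2] = 1                               # very low-entropy initial row

for t in range(201):
    if t % 50 == 0:
        print(t, round(block_entropy(curr), 3))
    curr, prev = step(curr, prev), curr        # evolve forward in time

# Running step(curr, next) recovers the previous row exactly, so the same
# dynamics can be run backwards; from a typical mid-entropy row, the block
# entropy rises in both time directions.
```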

      • Tyrrell McAllister

        That seems pretty trivial to me: you get analogs of the second law in practically any reversible cellular automaton that exhibits complex behaviour.

        The issue isn’t the 2nd Law. It’s making the connection to an arrow of time. For every trajectory through state-space moving from high entropy to low entropy, there’s a trajectory moving from low entropy to high entropy. Why is ours one of the former rather than the latter? Or, rather, what justifies our belief that it is?

        Given a state on a trajectory, the 2nd law says that its future is very likely to be high entropy. But the 2nd law also says that its past is very likely to be high entropy. For some reason, we think that there is countervailing evidence showing that our past was actually low entropy. But we also think that this evidence doesn’t imply that the future is low entropy.

        What justifies this asymmetry? That’s the problem of the thermodynamic arrow of time.

      • Tyrrell McAllister

        For every trajectory through state-space moving from high entropy to low entropy, there’s a trajectory moving from low entropy to high entropy. Why is ours one of the former rather than the latter?

        Ignore this point; it is incoherent. We just define the direction of time on a trajectory to point in the direction of lower entropy. My second point above is the real one. By the 2nd Law, we would expect the present to be a local entropy minimum with respect to both “temporal sides”. That is, we would expect both temporal directions on our trajectory to be “thermodynamic futures”.

      • mjgeddes

        Tim,

        There is an arrow of time problem regarding why thermodynamics predicts future evolution but not past evolution, as Hanson and Tyrrell both correctly explained.

        I’m sticking to my guns and saying it’s got something to do with Occam’s razor and complexity. Hanson and you both rebutted me in the earlier thread by stating that Occam can’t be the solution because high entropy states can be very simply specified, but I don’t buy it. It all depends on exactly how your complexity measure is defined.

        It’s Occam I’m telling you all. Universal priors are the answer. Of course, pinning down exactly why and how is where I’m running into a spot of bother ;)

    • Michael

      As far as I understand it, the reversal of entropy would eliminate any physical evidence that it existed, whilst simultaneously – via a reduction of information shared between the entropic system and any correlated systems – wiping the memory of any observer. Therefore if the universe is a closed system, time may only be perceived as moving forwards.

  • David Mazzotta

    Robin, do you believe humans have an existential desire for knowledge of the world — call it intellectual curiosity — or do you believe pursuit of knowledge to be utilitarian, either for status or practical applications?

    This is the line that makes me think you believe the latter: They read to affiliate with great men, and to lord their greater knowledge over lesser associates.

    I try to stay current in physics, but I cannot recall having a single conversation with anyone about it, nor have I benefited in any tangible way that I can see. I have always assumed myself to be driven by an innate intellectual curiosity, like an innate love of music, but I have been reading this blog far too long to trust my own perceptions of my motivations anymore.

    • http://knackeredhack.com knackeredhack

      David,

      I’ve had a similar problem. Since reading Robin’s blog, I became concerned that my motives for maintaining a garden wormery were nothing more than signalling greater environmental concern than my “lesser associates.” To overcome this possible bias, I vowed I’d never reveal the fact that I was composting my domestic waste in this way, to ensure in my own mind there was no doubt about the purity of my motives.

      Oh, sh%t. Now look what I’ve gone and done.

      Tim

  • Matt

    Robin, you are spot on as to why I read physics, but that is also why I read economics. I think one of the main reasons people get so worked up over economics isn’t just that they care more about it, but also that they are exposed to more controversial information when reading at a popular level. Disagreement in the field of physics is mostly outside popular understanding, but disagreement in economics is over popular-level stuff.

  • jonathan

    Why do physics books get written as though from on high? Because the writers are trying to convince themselves, since all they have to rely on is belief generated out of incomplete facts and models that don’t completely fit together, or which at minimum rely on assumptions about evidence that hasn’t actually been discovered. Because they’re trying to marshal support for their perspective, since prestige in the field is rooted in creativity AND acceptance. And all this is because we have no understanding of some basic why’s. It sometimes frightens me to realize how many fundamental absences we paper over.

    • Tyrrell McAllister

      Even if true, that wouldn’t explain the difference between physics and economics.

  • Constant

    Thinking Physics, by Lewis Carroll Epstein, which I read either in high school or in junior high, does not seem to fit the profile that Robin described. It remains my favorite popular physics book, and I read it, well, a long time ago. It’s still in print. My favorite popular economics books, on the other hand, do fit the profile.

    • Constant

      To clarify, I read it while I was in junior high or high school. Not a school book, but one I found and purchased at Wordsworth in Harvard Square, or possibly at the New England Mobile Book Fair. So: a popular book.

  • mjgeddes

    Possible super-humanly intelligent breakthrough on the problem of gravity and the arrow of time!

    ‘The Entropy Force’ (New Scientist)


    Verlinde’s work offers an alternative way of looking at the problem. “I am convinced now, gravity is a phenomenon emerging from the fundamental properties of space and time,” he says.

    To understand what Verlinde is proposing, consider the concept of fluidity in water. Individual molecules have no fluidity, but collectively they do. Similarly, the force of gravity is not something ingrained in matter itself. It is an extra physical effect, emerging from the interplay of mass, time and space, says Verlinde. His idea of gravity as an “entropic force” is based on these first principles of thermodynamics – but works within an exotic description of space-time called holography.

  • http://timtyler.org/ Tim Tyler

    Sean on video: “The Origin of the Universe and the Arrow of Time”

    http://www.youtube.com/watch?v=GFMfW1jY1xE

    Less crazy than I thought – but he doesn’t really present or consider alternatives to the “baby universe” hypothesis, which seems odd considering what wild speculation it is.

  • Pingback: Overcoming Bias : Are Gardens Fertile?

  • Pingback: Overcoming Bias : Tegmark’s Vast Math