Toward the end of the TV series Game of Thrones, a long (multi-year) winter was coming, and while everyone should have been saving up for it, they instead spent heavily on fighting wars. When others spend on war, that forces you to spend on war too, and then to suffer a terrible winter. The long-term future of the universe may be much like this, except that the future winter will never end.
The opening scenes in Star Wars VII (the first of the new ones) symbolized this very well. We see amazing futuristic technology, but everything is dwarfed by the enormous wrecks of battleships. We see people living in a time when vast resources are clearly available, but used for war, while they live on the edge of starvation.
I'm pretty sure I said something more specific than that.
"So the bottom line is that if war and theft remain possible for our descendants, the rate at which they do things will be much faster than the much slower most efficient speed....Now it is possible that there will be future resources that simply cannot be exploited quickly."
Sorry if this seems like a dumb question, but couldn't there also be a situation where the damage caused by theft and war is not as significant as it is now? This might seem paradoxical, since we'd assume galactic-sized civilizations will have galactic-sized wars, but what if, for instance, war became a virtual activity or theft became almost entirely digital? These might stimulate the same barbaric forces in our nature but with a much smaller physical footprint.
It's just an idea that popped into my head while reading this...sort of a "maybe the other side of the equation can change too" thing. I suppose an easy counter is that as long as there's booty to be stolen, there will always be pirates willing to kill and steal.
A lot of this reminds me of Asimov's Foundation, and his examples of the splintered, highly configurable, and tribal nature of humanity align with your vision. Foundation was the original 'Winter is Coming' story, aside from that one about grasshoppers and ants, of course.
I mean sure, but from that perspective doesn't your post reduce to "life in the universe inevitably faces scarcity, rest see above"?
Inefficient equilibria under scarcity are the generic problem of all existence. Might as well say "bad stuff".
It seems to me to be about the exact same problem: resource scarcity gives the advantage to exploitative strategies against which we are helpless, due to our inability to coordinate globally. Our current time of relative plenty is revealed as an unusual and transitory divergence from the norm of ruthless competition and defection traps, one that exists only because technological progress has temporarily let production growth outstrip our capacity to maximally exploit it for local competitive advantage.
That post is so vague re the problems it fears, I'm not really sure that is the same topic.
Just for reference (though I'm sure you've read it): this topic is also discussed in Scott Alexander's seminal Meditations on Moloch - https://slatestarcodex.com/...
The solution he proposes is a superintelligent morally benevolent singleton. Personally, I'm not sure how survival is at all possible with anything less.
I very much disagree that assuming the continued validity of thermodynamics is "wild speculation".
I didn't recommend or plan a grand strategy.
I thought the way it worked was the following: the number of irreversible bit-flips that 1 J of energy lets you do is (1 J)/(kT ln 2), per the Landauer bound, so as T->0 you can make computing arbitrarily energy-efficient. Maybe all the cool civilizations are hanging out far, far away from our raging noise-inferno stars. (edit: also, those are *irreversible* bit-flips. Reversible operations you actually get for free. Irreversible operations are needed to deal with external input/output, but internal state?...) PS: No computer we've built is anywhere close in OOM to these fundamental limits, but we're talking about unlimited refinement over geological time.
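To put rough numbers on the Landauer bound mentioned above, here is a small sketch (the temperatures chosen are just illustrative; the bound E >= kT ln 2 per erased bit is the standard one):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def landauer_bit_flips(energy_j: float, temp_k: float) -> float:
    """Upper bound on the number of irreversible bit operations that
    energy_j joules can pay for at temperature temp_k, assuming the
    Landauer limit of k*T*ln(2) joules per erased bit."""
    return energy_j / (K_B * temp_k * math.log(2))

# 1 J at room temperature (300 K) vs. near the CMB temperature (~3 K):
room = landauer_bit_flips(1.0, 300.0)  # ~3.5e20 bit erasures
cold = landauer_bit_flips(1.0, 3.0)    # ~3.5e22 -- 100x more as T drops 100x
print(f"{room:.2e} {cold:.2e}")
```

Note the bound is linear in 1/T, not exponential, but the conclusion is the same: a cold civilization gets vastly more computation per joule, and current hardware sits many orders of magnitude above this limit.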
Isn't it a little silly to be planning grand strategy for our great^10000 grandchildren, when our great^10000 grandparents didn't even have fire? Would you follow a life-plan written by someone in bronze-age (wherever-your-ancestors-came-from)? (edit: great^10000, human generations keep being shorter than I expect - pick a large integer - you're talking about a pretty large one when you're talking about the fate of galaxies and all-energy-in-the-cosmos)
Thought experiments are well and good, but it's hard to take it too seriously. The unspoken assumption is that there is nothing new to learn about the universe, and that the rules and objectives of the game of life are closed. (And if that were true, what are your descendants going to be *doing* with all that computation?)
1) The fundamental laws of physics are time-symmetric; the second law of thermodynamics actually isn't one of them.
2) Entropy is *subjective*. Several fundamental postulates of statistical mechanics are ones we actually know not to be quite true. This might have no interesting consequences, or profound ones, depending on what we learn over billions of years.
3) Worrying about the heat death of the universe as a mortal human is about as pointless as a squirrel worrying about global uranium reserves. Just because you can wildly extrapolate what you think you know, and just because there could be civilizations to which things like global uranium reserves matter, doesn't mean it's relevant to the problems that actually matter in the squirrel's life.
Our descendants may not have the same opportunities to solve this problem that we have. At a minimum, it seems much easier (even if still very hard in an absolute sense) to solve this problem (for example, by coordinating to build a Singleton / strong world government) before human or post-human civilization starts spreading into the stars. Lack of urgency seems justifiable only if one were very certain that space colonization is far in the future, and I don't see how that belief is justifiable.
3-4 decades ago Freeman Dyson wrote two papers about life in alternative distant futures. One concerned an (unaccelerated) expansion / heat death future, the other a Big Crunch. In each, he sketched a means by which intelligence could think an unbounded number of thoughts (running ever slower during the expansion, ever faster in the Big Crunch).
Apologies for the lack of citations and this possibly-skewed summary. And at least one of these papers has been overtaken by events (discovery of the accelerated expansion). Still worth a mention I think.