"The surface area of a city, where cooling fluids can go in and out, goes as the square of city scale , while the volume to be cooled goes as the cube of city scale."
Wouldn't the volume to be cooled be proportional to the surface (so ^2), and the "in and out" to perimeter (so linear)?
Unless you picture pipes coming in and out from the ground or top, and the height of cities growing with scale for some reason.
ISWYM. I intended to refer to reducing local heat generation by keeping the entropy generated by computations in the digital domain - in order to pipe it around more effectively. Converting conventional heat into digital heat doesn't seem likely to be very useful.
For example, by using the piezoelectric effect to translate between random molecular collisions and digital radiation bursts. Any radiation of signals will reduce the temperature of the radiating body. Hot bodies spit out digital signals - in the form of photons and other particles -all the time, and they are cooler as a result of their emissions. It is no big deal.
No; you brought it up by saying you could digitize heat. Talking about reversible computation at all implies knowing digital entropy must produce physical heat. I thought you were saying we could transport heat by digitizing it.
Only you are talking about converting physical heat into digital heat. Everyone else is discussing going in the opposite direction - converting digital heat into physical heat.
Of course, in principle, you can do the conversion you specify. Generating disordered data from hot objects can cool them down a little. Heat is just disordered motion - which is a substrate-neutral concept. Heat is portable.
Heat is high levels of entropy. Digital heat is high levels of digital entropy - i.e. a disordered / incompressible string of bits. If you erased those bits, you would convert digital heat into conventional heat - along the lines suggested by Landauer's principle.
Interesting. Relies on the existence of "time crystals", on which see http://phys.org/news/2013-0... . Also, Google shows zero references in the world to "heat superconductivity" other than this paper.
Not a permanent game-changer, as continued operation of economic forces should rapidly result in converting the Earth to computronium under the tech level we're talking about. So what really matters is how to radiate heat away from the Earth.
Memories appear at most as crisp as HD video (of which Blu-ray is already a very inefficient form: an mp4 file can deliver the same noticeable quality in a much smaller file), but memories do contain elements not found in video such as smell, touch and emotions, however there is no reason to assume those take up a lot of space (it's not hard to imagine memories merely contain index references to tables of smells, touch and emotions). HD resolution at, say 48fps, with two video and audio streams (for both eyes and ears), is enough to make something look real to us. Then it's a matter of what form of compression the human mind uses, all in all memories indeed don't have to be computationally expensive.
The "kernel" of core memories that include skills, personality traits and possibly tables of emotions, smell, touch cannot be separated from the computations of the mind so easily but then again this form of memory doesn't have to be spacious at all, maybe only a day worth of conscious memories, maybe much less even (as in a video game or operating system where the engine/kernel is dwarfed in size by the textures/apps)
A Blu-ray Disc video uses data rates up to 54 Mbit per second. Since our memories aren't high-definition movies, that's probably an overestimation of the data rate that our brain uses to store its memories.
Storing at that data rate on hardware operating at Landauer's limit at room temperature requires dissipating about 0.15 picowatt of heat.
A human brain (the most energy-efficient type of computing hardware currently known) has an energy consumption of 20-40 watts. That's nowhere near thermodynamical limits.
Most of the brain's operations ARE unconscious (depending on which metaphysical theory you go with most of these operations are either really unconscious or consciously experienced by subdivisions of the brain, not by "you").
Btw, why does this even matter? Writing and accessing memories isn't that computationally extensive (and will thus produce relatively small amounts of heat), so why can't memory be done by a non-reversible computer with all the complicated calculations of intuition, emotion and empathy being done by a reversible computer?
"The surface area of a city, where cooling fluids can go in and out, goes as the square of city scale , while the volume to be cooled goes as the cube of city scale."
Wouldn't the volume to be cooled be proportional to the surface (so ^2), and the "in and out" to perimeter (so linear)?
Unless you picture pipes coming in and out from the ground or top, and the height of cities growing with scale for some reason.
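The quoted claim is the square-cube law; treating the cooled region as a cube of side L (a simplifying assumption, not anything from the article), the surface-to-volume ratio falls off as 1/L:

```python
# Square-cube law: for a region of linear scale L, cooling surface grows
# as L^2 while the volume to be cooled grows as L^3, so the surface
# available per unit of volume shrinks as 1/L.
for L in [1, 10, 100]:
    surface = 6 * L**2   # faces of a cube of side L
    volume = L**3
    print(L, surface / volume)  # ratio falls as 6/L
```

The objection above amounts to saying a city is better modelled as a thin slab than a cube, in which case cooled volume goes as L^2 and the perimeter through which coolant enters goes as L.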
According to Terry Gilliam's movie "Brazil," the future is ducts, lots and lots of ducts.
So no matter if we are talking networks or cooling, the future is a series of tubes?
ISWYM. I intended to refer to reducing local heat generation by keeping the entropy generated by computations in the digital domain - in order to pipe it around more effectively. Converting conventional heat into digital heat doesn't seem likely to be very useful.
For example, by using the piezoelectric effect to translate between random molecular collisions and digital radiation bursts. Any radiation of signals will reduce the temperature of the radiating body. Hot bodies spit out digital signals - in the form of photons and other particles - all the time, and they are cooler as a result of their emissions. It is no big deal.
You are assuming perfect parallelizability. What about Amdahl's law? http://en.wikipedia.org/wik...
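Amdahl's law caps the achievable speedup by the serial fraction of the workload, no matter how much parallel hardware is added; a minimal sketch (the 5% serial fraction is just an illustrative number):

```python
# Amdahl's law: overall speedup from parallelizing is limited by the
# fraction of the work that must remain serial.
def amdahl_speedup(serial_fraction, n_units):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_units)

# Even with a million parallel units, a 5% serial fraction caps speedup near 20x.
print(round(amdahl_speedup(0.05, 1_000_000), 2))  # ~20.0
```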
No; you brought it up by saying you could digitize heat. Talking about reversible computation at all implies knowing digital entropy must produce physical heat. I thought you were saying we could transport heat by digitizing it.
That's going in the opposite direction. But how do you digitize physical heat (end up with lower temperature and random bits)?
Only you are talking about converting physical heat into digital heat. Everyone else is discussing going in the opposite direction - converting digital heat into physical heat.
Of course, in principle, you can do the conversion you specify. Generating disordered data from hot objects can cool them down a little. Heat is just disordered motion - which is a substrate-neutral concept. Heat is portable.
Heat is high levels of entropy. Digital heat is high levels of digital entropy - i.e. a disordered / incompressible string of bits. If you erased those bits, you would convert digital heat into conventional heat - along the lines suggested by Landauer's principle.
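The erasure cost mentioned here can be put in numbers; a quick sketch of Landauer's bound, assuming room temperature (300 K):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer's principle: erasing one bit dissipates at least kT ln 2 of heat.
energy_per_bit = k_B * T * math.log(2)
print(f"{energy_per_bit:.2e} J")  # ~2.87e-21 J per erased bit
```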
Interesting. Relies on the existence of "time crystals", on which see http://phys.org/news/2013-0... . Also, Google shows zero references in the world to "heat superconductivity" other than this paper.
Not a permanent game-changer, as continued operation of economic forces should rapidly result in converting the Earth to computronium under the tech level we're talking about. So what really matters is how to radiate heat away from the Earth.
Memories appear at most as crisp as HD video (of which Blu-ray is already a very inefficient encoding: an mp4 file can deliver the same perceived quality in a much smaller file). Memories do contain elements not found in video, such as smell, touch and emotions, but there is no reason to assume those take up much space (it's easy to imagine memories merely containing index references into tables of smells, touches and emotions). HD resolution at, say, 48 fps, with two video and two audio streams (one for each eye and ear), is enough to make something look real to us. After that it's a matter of what form of compression the human mind uses. All in all, memories indeed don't have to be computationally expensive.
The "kernel" of core memories that include skills, personality traits and possibly tables of emotions, smell, touch cannot be separated from the computations of the mind so easily but then again this form of memory doesn't have to be spacious at all, maybe only a day worth of conscious memories, maybe much less even (as in a video game or operating system where the engine/kernel is dwarfed in size by the textures/apps)
A Blu-ray Disc video uses data rates up to 54 Mbit per second. Since our memories aren't high-definition movies, that's probably an overestimation of the data rate that our brain uses to store its memories.
Storing at that data rate on hardware operating at Landauer's limit at room temperature requires dissipating about 0.15 picowatt of heat.
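That figure can be checked in a few lines (assuming 300 K and the CODATA value of the Boltzmann constant):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K
bit_rate = 54e6     # Blu-ray peak data rate, bits/s

# Minimum dissipation for irreversibly recording at this rate (Landauer limit):
power = bit_rate * k_B * T * math.log(2)
print(f"{power * 1e12:.3f} pW")  # ~0.155 pW
```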
A human brain (the most energy-efficient type of computing hardware currently known) consumes 20-40 watts. That's nowhere near the thermodynamic limits.
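For scale, a rough sketch of how many bit erasures per second 20 W would pay for at the room-temperature Landauer limit (300 K assumed):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Number of bit erasures per second that 20 W could fund at the Landauer limit.
erasures_per_second = 20.0 / (k_B * T * math.log(2))
print(f"{erasures_per_second:.1e}")  # ~7.0e+21 erasures/s
```

The gap between that number and anything the brain plausibly does is what "nowhere near thermodynamic limits" means here.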
Most of the brain's operations ARE unconscious (depending on which metaphysical theory you go with, most of these operations are either really unconscious or consciously experienced by subdivisions of the brain, not by "you").
Btw, why does this even matter? Writing and accessing memories isn't that computationally expensive (and will thus produce relatively little heat), so why can't memory be handled by a non-reversible computer, with all the complicated calculations of intuition, emotion and empathy being done by a reversible computer?
If they are not physically reversible, they will also produce heat locally.
It is trivial to create a logically reversible gate (e.g. Toffoli or Fredkin) from common logically irreversible gates, but it would serve no purpose.
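For illustration, the reversibility of the Toffoli (CCNOT) gate mentioned above can be verified exhaustively - it is a bijection on 3-bit states and its own inverse (a standalone sketch, not tied to any particular hardware):

```python
from itertools import product

# Toffoli (CCNOT) gate: flips the target bit c iff both controls a, b are 1.
def toffoli(a, b, c):
    return (a, b, c ^ (a & b))

states = list(product([0, 1], repeat=3))
outputs = [toffoli(*s) for s in states]

# Reversible: no two inputs collide (bijection), and the gate undoes itself.
assert len(set(outputs)) == 8
assert all(toffoli(*toffoli(*s)) == s for s in states)
print("Toffoli is reversible")
```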
That isn't converting physical heat into digital heat. How can I cool my refrigerator and pipe the heat out through a fiber-optic cable?