Hacker News
Absolute Hot (wikipedia.org)
155 points by johnny313 on Aug 14, 2018 | 71 comments



Unfortunately, this is a terrible name, concept, and even article.

The vast majority of it is directly from a pop-science NOVA episode and not actually well-backed.

There are reasonable bounds you can place on energy density where we expect current physical theories to stop making sense.

But energy density is not the same as temperature. It is true that for things like ideal gases, temperature is roughly "energy per degree-of-freedom", which is an energy density of sorts, but that's not fundamentally what temperature is.

Temperature is nothing more than a specific measure of how energy will flow due to entropic effects. In the right systems, this can be arbitrarily high without a high energy density. (In fact, elsewhere on this very post, people have pointed out "negative temperatures", where the temperature becomes "hotter than infinity" and "wraps around" to negative.)


If the temperature wraps around to negative, doesn't that imply a maximum value at the wrap point? Do we know the temperature value at which it wraps to negative? Is it literally "infinity Kelvin?"


Yes, the wrap point is literally infinity, wrapping to negative infinity.

https://en.wikipedia.org/wiki/Negative_temperature is not terrible, for an overview, though the disclaimer is just annoying at this level.

For thinking about this point, it's much easier to talk about "thermodynamic beta" (sometimes called "coolness" or "coldness") which is just 1/T = partial S/partial E. The behavior of a spin system that admits negative temperatures can be described smoothly in terms of beta -- hotter systems have beta that is lower, and zero is not particularly special.
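As a sketch of how beta behaves, consider a toy system of N non-interacting two-level spins with unit level spacing (my own illustrative example, not from the thread; k_B = 1): beta = dS/dE passes smoothly through zero as the population inverts.

```python
import math

def entropy(N, n):
    # S = ln(number of microstates with n of N spins excited), with k_B = 1
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

def beta(N, n, eps=1.0):
    # beta = dS/dE, estimated by a central finite difference; E = n * eps
    return (entropy(N, n + 1) - entropy(N, n - 1)) / (2 * eps)

N = 1000
print(beta(N, 100))   # few spins excited: beta > 0 (ordinary positive temperature)
print(beta(N, 500))   # half excited: beta = 0 ("infinite temperature")
print(beta(N, 900))   # population inverted: beta < 0 (negative temperature)
```

Nothing special happens to beta at the half-filled point; it just crosses zero, which is why it is the smoother quantity to think in.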

Now, any real system is coupled to the rest of the environment, so can't be in equilibrium at a negative temperature, as it would continuously leak heat until it cooled down enough to have some positive temperature. But if its internal equilibration proceeds much faster, then it's still useful to talk about its temperature as a quasi-equilibrium case.


Thanks!


Question: is the analogy of thermal energy as particles flying around and bouncing into each other just an analogy? At what temperature would the particles fly at the speed of light?

> Above about 10^32K, particle energies become so large that gravitational forces between them would become as strong as other fundamental forces according to current theories.

I see, the gravitation would become a problem even before the speed.


You can't accelerate anything to the speed of light without an infinite amount of energy. It just takes more and more energy to get closer to that speed. https://en.wikipedia.org/wiki/Speed_of_light#/media/File:Lor...


> You can't accelerate anything to the speed of light

You can't accelerate anything _with mass_ to the speed of light. Although I guess that stuff with no mass already travels at the speed of light, so you wouldn't need to accelerate it.


Precisely. Photons always move at c.


Does that mean it doesn't take more energy to accelerate an electron to the speed of light than a tennis ball?


No massive particle can achieve the speed of light. See:

Ek=mc^2/√(1−(v/c)^2)−mc^2

For v << c this reduces to the Newtonian Ek = (mv^2)/2, where the energy required to reach a given speed scales directly with mass.

As v -> c the mass matters less and less: the Lorentz factor dominates, and the mass acts only as an overall multiplier.

As v -> c, the factor x = √(1−(v/c)^2) goes to zero, and Ek ~ 1/x, i.e. it tends to infinity, with a division by zero at v = c.

In conclusion, near the speed of light it is the speed rather than the mass that is the relevant factor, regardless of whether the object is an electron or Mount Everest.

Of course, assuming the equation holds ;-)
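A rough numerical sketch of the formula above (the masses are illustrative round numbers, not figures from the thread):

```python
import math

def kinetic_energy(m, v, c=299_792_458.0):
    # Relativistic kinetic energy: Ek = (gamma - 1) * m * c^2
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return (gamma - 1.0) * m * c ** 2

c = 299_792_458.0
m_electron = 9.109e-31   # kg
m_ball = 0.058           # kg, a typical tennis ball

for frac in (0.8, 0.999, 0.9999999):
    ratio = kinetic_energy(m_ball, frac * c) / kinetic_energy(m_electron, frac * c)
    print(frac, ratio)   # ratio is just m_ball / m_electron, independent of speed
```

The ratio of energies is the same at every speed, which is the sense in which mass is "just a base multiplier" while the Lorentz factor does all the diverging.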


Another perspective is that the object's effective mass diverges as v -> c. I mean, that's why we say "rest mass", right?


Since E=mc^2 -> m=E/c^2.

For a moving object you could then write m=(Er+Ek)/c^2, which creates the impression that the mass is variable (the term Ek is zero at rest and increases with velocity), giving rise to the terms 'rest mass' and 'relativistic mass' for the rest-energy and total-energy equations respectively.

This interpretation is somewhat outdated but the terminology rest mass maintains its legacy. One could refer to it as the `(proper |invariant |intrinsic )?mass` instead.

The variable-mass issue is then 'solved' by 'refactoring' the equation to use momentum, where mass is coupled with velocity and the complexity of the Lorentz factor is absorbed.


So how does 'invisible' kinetic energy (say, that of a ball traveling on the planet) change its mass?

If we cancelled out that movement (relative to the Milky Way), does the mass change?

That is, how much do the various relative motions affect our mass, and would it be possible to pull them apart? (Solar System, relative to the galactic center, other galaxies, etc.)


I am not sure, I am not a physicist.

What I understand is that the contemporary conception is that the mass does not change.

I just presented the argument for a notion of rest mass and relativistic mass.


Thanks!


It takes more energy to accelerate a tennis ball to 80% of the speed of light (0.8c) than it takes to accelerate an electron to 80% the speed of light. It takes more energy to accelerate a tennis ball to 0.999c than an electron to 0.999c.

But the faster and faster you go, the more energy is required. As the speed approaches 100% of c, the energy required for each tiny fractional step grows without bound, so much so that it is impossible to accelerate either object to 100% of the speed of light. It just requires more, and more, and more energy.
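A quick sketch of how steeply this climbs, in units of mc² (a toy calculation of my own, not from the thread):

```python
import math

def gamma(v_over_c):
    # Lorentz factor; kinetic energy in units of m*c^2 is gamma - 1
    return 1.0 / math.sqrt(1.0 - v_over_c ** 2)

# Each extra "9" in the speed costs dramatically more kinetic energy
for v in (0.9, 0.99, 0.999, 0.9999, 0.99999):
    print(v, gamma(v) - 1.0)
```

The printed energies keep growing without any finite limit as v approaches 1, which is the point: there is no budget large enough to reach exactly c.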


In the sense that if you multiply infinity by an arbitrarily large finite number you still have infinity, yes.


Neither of these things is possible, so comparison is meaningless.


> At what temperature would the particles fly at the speed of light?

Never. More precisely, never for particles of ordinary matter that have nonzero rest mass. Relativistic effects change the dependence of temperature on velocity (more precisely, the dependence of kinetic energy on velocity), so that kinetic energy/temperature increases without bound as the speed of light is approached.

For a "gas" of photons, particles of light, the particles always move at the speed of light, because they have zero rest mass. But photons can have any kinetic energy, so a photon gas can have any finite temperature.


Right, this was how I was taught about temperature, and I'm just realizing now it probably isn't the best analogy... Like, do particles shot through a particle accelerator have a super high temperature? They're moving awfully fast! Or does it have to be "vibration," in which case, "vibrating" relative to what?


It's technically not 100% defined. See my comment below for more details about degrees of freedom and energy.

Suppose you have a system with 100 degrees of freedom and 2 units of energy, spread out as (0.01, 0.01, ..., 0.01, 1.01). A bunch of its energy is in one of those hundred degrees of freedom. You can assign it two different temperatures: the temperature 0.01, which would describe how energy will right now flow into the system if you connect it to another system with a bunch of degrees of freedom with their own thermal energy (assuming that the 1.01 degree of freedom is "internal" and doesn't interact directly with the outside world), and the temperature 0.02, which would describe how energy will eventually be spread out and hence how it would eventually exchange energy with the outside world.
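The two numbers in that toy example can be checked directly (arbitrary units; temperature here is simply energy per degree of freedom, the same simplification the example uses):

```python
energies = [0.01] * 99 + [1.01]   # 100 degrees of freedom, 2 units of energy total
assert abs(sum(energies) - 2.0) < 1e-9

# "Right now" temperature: energy per thermal degree of freedom,
# treating the 1.01 degree of freedom as internal/special
t_now = sum(energies[:99]) / 99
print(t_now)        # 0.01

# Eventual temperature once the energy spreads over all 100 degrees of freedom
t_eventual = sum(energies) / 100
print(t_eventual)   # 0.02
```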

Temperature is ultimately defined in terms of how our uncertainty about the microscopic state a system is in changes as we add energy to that system. The higher this rate of change of uncertainty, the lower the temperature is -- this is why when you connect two systems of different temperatures, in the process of us becoming more uncertain about the fundamental state of the world, energy "spontaneously" flows from the higher temperature to the lower temperature: the certainty gained from stealing energy from the higher-T one is more than compensated by uncertainty created from pouring that same energy into the lower-T one. (In fact there is a family of systems of "negative temperature" which become less uncertain as you add more energy to them: they are "hotter than the hottest possible temperature" because they will gladly give their energy to any "normal" system in the process of us becoming more uncertain about the world.)

The problem is that if we're certain that some degree of freedom has a given amount of energy that's "special", we have a bunch of different definitions of "temperature" depending on how "adding energy to the system" distributes between the "special" degree of freedom and the "thermal" degrees of freedom.

So the usual process is to just totally separate those degrees of freedom as separate systems, the "thermal" ones have a temperature, the "special" ones do not.


> there is a family of systems of "negative temperature" which become less uncertain as you add more energy to them . . .

I'm no physicist, just a chemist. What are they?


I mean it's not just one system, but the idea is what I just said.

The classical example is if you have a bunch of magnetic moments in a magnetic field and they do not interact with each other: then stuffing energy into the system requires aligning them against the magnetic field, and this makes the state more ordered. The problem is that these moments are generally in thermal contact with some apparatus that keeps them in place or vibrational degrees of freedom of their centers of mass or so. But you can get this thing to happen in magnetic resonance setups.

Negative-temperature states pop up in a lot of strange places. The two that I know more closely are lasing, which has the property that "as I dump more energy into the system I get more bosons in the lasing state", and a little 1949 article by Onsager called “Statistical Hydrodynamics”, which sort of went viral for its time: it points out that the instability of turbulent systems can be viewed as due to negative-temperature regimes of the vortices in those systems.


No, it can just be the average kinetic energy of the particles. In the ideal gas model it's just single particles bouncing around in a box, and temperature is proportional to their average kinetic energy. Once you introduce diatomic and larger molecules, you also have to account for vibration (of the bonded atoms) and rotation as additional degrees of freedom for storing internal energy.


It's not just an analogy, and it contains the essence of the story, but it's also not the whole story.

In physics we talk about the "degrees of freedom" of a system -- this is just the count of all of the independent ways that it can move. For each degree of freedom of a system you can calculate the average energy in that degree of freedom. By the equipartition theorem, at thermal equilibrium, all the degrees of freedom will have the same average energy, which will be T (if you measure temperature in units of energy).

So if you think about dropping a bouncy ball in a tube and it bounces until it slowly comes to rest, it has these degrees of freedom -- the internal degrees of freedom of the atoms of the ball, the internal degrees of freedom of the atoms of the floor/tube -- and then two really obvious degrees of freedom, the center-of-mass position of the ball, which gains an energy scale due to the gravitational force, and the center-of-mass momentum of the ball, which trades energy with this position degree-of-freedom.

Statistical mechanics says that as this system progresses, the location of the energy will slowly become more uncertain until it is on-average-evenly distributed across all of the degrees of freedom. That's why it bounces lower and lower: there is so much energy in the two "main" degrees of freedom -- maybe half a joule? -- whereas in the vibrations there is something closer to 10^-21 J of energy at room temperature.
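The orders of magnitude quoted here can be checked with the Boltzmann constant (assuming room temperature of about 300 K and taking the "maybe half a joule" figure from the text at face value):

```python
k_B = 1.380649e-23   # Boltzmann constant, J/K

thermal = k_B * 300   # thermal energy scale per degree of freedom at ~300 K
ball = 0.5            # J, the "maybe half a joule" of the bouncing ball

print(thermal)        # ~4.1e-21 J, i.e. "closer to 10^-21 J"
print(ball / thermal) # ~1.2e20, the roughly 20 orders of magnitude mentioned below
```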

But the flip side of dissipation is always fluctuation -- this is in fact the subject of a major theorem! So the fact that this can randomly lose energy to these other degrees of freedom means that those degrees of freedom are also randomly kicking the ball. As you can imagine with ~20 orders of magnitude difference between the two, they don't kick this ball by all that much. But you have a lot of experience with a lot more tiny balls that are bouncing off the ground all the time. Take a deep breath. There they are.

If everything were to come to its minimum energy configuration, why are these air molecules so stubbornly not falling to the floor? Well, they are trying to! But they are so light that they are being kicked back upwards by these random thermal kicks, so high that they can in principle go the many kilometers to the uppermost atmosphere.

(Of course if they could go all that way in a single kick then air would have to be so non-interactive that we could not use it to talk to each other... the mean free path in air is actually about 68 nm, so in practice every air atom is getting its random thermal kicks from other nearby air atoms. But the ultimate origin of these random thermal kicks is the random kicks of the floor on the few hundred nanometers of air sitting above it, and that energy comes from the Sun and is mostly conserved as these atoms collide with each other -- but a tiny bit is often converted to little photons of infrared light that sometimes escape the atmosphere.)

With that said as others have noticed, the free-particle energy relation in special relativity is E = γ m c². Famously, at rest, this factor γ = 1/√(1 − (v/c)²) is 1 and the energy of a particle at rest is E = m c². But as v gets closer and closer to c, v → c, this energy grows without boundary, E → ∞. So there is no finite temperature where a kinetic degree of freedom would exceed the speed of light. Indeed you can solve for v, as 1/γ² = 1 − (v/c)². So the velocity corresponding to any given total energy is v = c √(1 − (mc²/E)²). For a rest particle with E = mc² this is v = 0 as you would expect; or when the kinetic energy first gets to mc² we would have E = 2mc² and thus v = c √(3/4) = 0.866 c.
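The closing formula can be checked numerically (energies expressed in units of mc²):

```python
import math

def v_over_c(E_over_mc2):
    # Invert E = gamma * m * c^2: v/c = sqrt(1 - (m c^2 / E)^2)
    return math.sqrt(1.0 - (1.0 / E_over_mc2) ** 2)

print(v_over_c(1.0))   # at rest, E = mc^2 gives v = 0
print(v_over_c(2.0))   # kinetic energy equal to mc^2 gives v ≈ 0.866 c
print(v_over_c(1e6))   # enormous but finite energy: v is still below c
```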


There's no finite level of kinetic energy at which the speed of an object exceeds c.


I believe you mean, "There's no finite level of kinetic energy at which the speed of an object equals c."

Exceeding c is, of course, not known to be possible at all, even with infinite energy.


Except for massless particles, which do go at the speed of light. And that's important: photons etc. are part of the thermal soup just like everything else.


Ooh boy, quantum mechanics is fun :-)

Think of the quantum vacuum as having a large number of degrees of freedom waiting to get excited by energy -- like a fleet of unused AWS instances in a system with very effective load balancing. The moment the load (roughly, energy) on the running instances (particle present in the system aka "quanta") increases beyond the threshold for creating a new one (aka rest mass of a new particle), a new instance is spontaneously created. Heating the system is akin to increasing the load on your system, and new instances will keep getting spun up.

Is there a limit on how many such particle instances can be created? If we neglect gravity, no -- you can just keep adding instances/quanta and never run out. (And however much energy you dump in, the system's temperature will not increase beyond the Hagedorn limit [2].)

But if you stop ignoring gravity, the gravitational attraction between the spun-up instances will keep increasing as you spin up more of them, eventually forming a black hole at some point (because you cannot squeeze more than a certain amount of information into a given volume [1]). This is roughly where you wave your hands and come up with heuristic explanations using the Planck length, Planck mass, etc.

That's the limit of current understanding. Any refinement to this story would be a massive breakthrough!

PS: A relatively sobering (nonetheless exciting) possibility is that, well before gravitational effects become important, your "effective field theory" proves insufficient to model the system and you are led to a "more fundamental" model.

[1]: http://scholarpedia.org/article/Bekenstein-Hawking_entropy

[2]: A technical explanation of the Hagedorn limit: at finite temperature, the occupation probability of states decays exponentially with energy (i.e. energy divided by temperature gives the log-probability) [3]. But if the degeneracy of high-energy states grows exponentially, that can entropically compensate for the exponential decay of the occupation probability, giving more occupation at higher energies than at lower ones! The transition point in this tradeoff is the Hagedorn limit. That is why additional energy is more likely to create new particles/states than to simply increase the per-particle energy of the existing ones.

[3]: https://en.wikipedia.org/wiki/Maxwell%E2%80%93Boltzmann_stat...
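Footnote [2]'s tradeoff can be sketched with a toy weight function (the units and the Hagedorn temperature are arbitrary illustrative choices, not physical values):

```python
import math

T_H = 1.0   # Hagedorn temperature, in arbitrary units

def weight(E, T):
    # Degeneracy grows like exp(E / T_H); the Boltzmann factor decays like exp(-E / T).
    # Their product decides whether high-energy states are favored.
    return math.exp(E / T_H) * math.exp(-E / T)

# Below T_H high-energy states are suppressed; above T_H the weight grows with E
for T in (0.5, 0.9, 1.1):
    print(T, weight(10.0, T), weight(20.0, T))
```

For T below T_H the weight falls off with energy as usual, while for T above T_H it grows, so the partition sum diverges and extra energy goes into making more states rather than raising the temperature.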


Good answer from Physics SE on why there's no upper limit to temperature: https://physics.stackexchange.com/questions/1775/why-is-ther...


Would also add to this that ideas about what "temperature" means change when you switch between equilibrium and nonequilibrium thermodynamics. In nonequilibrium thermodynamics a bunch of your equations go out the window, like the Boltzmann distribution and the many rules built on top of it.



So +0K is the lowest low, and -0K is the highest high.


Right. In many contexts it makes more sense to use 1/T. Then there's no discontinuity and as things get hotter you decrease smoothly from positive, through 0, to negative.


I remember when I first learned about absolute zero, ages ago, and thought “if it’s unreachable, it must be like an asymptote…but that would mean we’re measuring temperature ‘upside-down’…”

Much later, I learned about that very thing (1/T, “thermodynamic beta”) while wikiwalking after hearing about the concept of negative temperature. Then I fell into a rabbit-hole wondering if we also measure speed upside-down in that way, since you can’t accelerate to light-speed. And indeed, when you’re talking about relativistic effects, units of time per distance can be more illustrative sometimes than our intuition of distance per time.


For speed we don't measure it "upside down", no. But there is another transformation: "rapidity", which does add linearly (at least for one spatial dimension) and goes to infinity when velocity goes to c.

https://en.wikipedia.org/wiki/Rapidity
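A small sketch of the linked idea (velocities as fractions of c): rapidities add linearly, and converting back reproduces the usual relativistic velocity-addition formula.

```python
import math

def rapidity(v):
    # Rapidity phi = artanh(v/c); here v is already a fraction of c
    return math.atanh(v)

def velocity(phi):
    # Convert a rapidity back to a velocity (fraction of c)
    return math.tanh(phi)

# Compose two boosts of 0.8c by adding their rapidities
v1, v2 = 0.8, 0.8
combined = velocity(rapidity(v1) + rapidity(v2))
print(combined)                      # ~0.9756, still below c
print((v1 + v2) / (1 + v1 * v2))    # the standard composition formula agrees
```

Since tanh never reaches 1, no finite sum of rapidities ever reaches c, which mirrors the temperature discussion above.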


Well no, I just meant that sometimes working in inverse units makes things easier to understand than the “intuitive” units you use in daily life. “Time per distance” helped me personally get a better intuitive sense of time dilation and length contraction. I hadn’t heard of rapidity before, but it’s really interesting, and makes sense after some musing—thanks!


1/0 seems like a pretty major discontinuity.


There's no discontinuity at 0. It's a limit point that is unreachable, as far as we know. That's why 1/T is a better measure. It makes it obvious that +0 K is the unreachable limit of "adding a quantum of energy results in an infinite increase in entropy". (−0 K is the unreachable limit of "adding a quantum of energy results in an infinite decrease in entropy".)


This is why it’s more natural to think in terms of thermodynamic beta (inverse temperature).


A temperature so hot that we have to invent a new quantum theory before we can figure out what physics would be like at that level? Yikes.


Hmm, at school we were taught that the maximum temperature is such that wavelength of the emitted black-body radiation would equal Planck length, is this rationale no longer sound?


It was never sound.

Black-body radiation describes the distribution of power emitted at given wavelengths or frequencies emitted by an ideal black body.

1. Distributions don't have one wavelength. There is a "peak" wavelength at which most power-per-wavelength is emitted, but there is still power emitted at all wavelengths no matter what temperature something is.

2. There is not sufficient reason to believe that the Planck length is a limit on the wavelength of light. This would break Lorentz symmetry.

3. Most things aren't ideal black bodies, but still have temperatures. Even if very hot objects couldn't emit light of Planck length, they can still couple to the environment and emit heat in other ways.
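For what it's worth, the back-of-the-envelope behind the classroom claim can be reproduced with Wien's displacement law, λ_peak = b/T: setting the peak wavelength to the Planck length lands at the same order of magnitude as the Planck temperature. This is purely the heuristic the points above are criticizing, with CODATA values for the constants:

```python
b = 2.897771955e-3        # Wien displacement constant, m*K
l_planck = 1.616255e-35   # Planck length, m

T = b / l_planck
print(T)   # ~1.8e32 K, the same order as the Planck temperature (~1.4e32 K)
```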


> maximum temperature is such that wavelength of the emitted black-body radiation would equal Planck length

... Also known as the Planck Temperature, which is the highest listed possibility for absolute hot in TFA.


There is an amazing video by Vsauce called "How Hot Can It Get" which deals with the same concepts. I very highly recommend it for everyone.

https://youtu.be/4fuHzC9aTik


Thanks for that, I hadn't bumped into that channel before. After watching this one I watched the one on how to count past infinity... mind suitably blown :)

https://www.youtube.com/watch?v=SrU9YDoXE88


If we are posting great V-sauce videos, "Which way is down" is the best explanation of the relationship between space-time and gravity I have come across.

https://youtu.be/Xc4xYacTu-E


If you like this, you might enjoy: https://en.wikipedia.org/wiki/Negative_temperature


Planck's constant really changed my understanding of what the universe is made of.

Our physics and understanding of matter seems to be relevant under very specific conditions.

The moment we can make technology that can impact the smallest of sizes (if this is even possible), we might get an answer for what the universe is. Or maybe it would turn out to be "42" and it still wouldn't make sense.


Wouldn't fusion trigger at much, much lower temperatures?

We are talking about the theoretical limit of temperature, but what is the practical limit? There's a point beyond which heating hydrogen just gets you helium and more heat, but fusing anything heavier than iron absorbs energy, leaving you with something colder than the inputs.


Isn't fusion a reaction which has to happen? I would think that at that energy level there are no more elements which are able to form.


This video is probably the best explanation https://www.youtube.com/watch?v=oHyctwgE6m4


I have a really simple question.

Has anyone ever done an experiment to confirm that SR comes into play at ultra-high temperatures?


Note how T stands for temperature, so we can use the symbols NIL for absolute zero and T for absolute hot.


What is "simple english" below the languages list? Never seen this before.


It's been around for nearly 15 years. It's geared towards younger folks, folks with learning disabilities that might make reading difficult, and people who are learning English as a second language.

https://simple.wikipedia.org/wiki/Simple_English_Wikipedia


Incidentally, for many readers of Wikipedia it may not be the language per se that is difficult to understand, so it would be nice if there were simpler versions of articles that otherwise may not be accessible to a lay person.


Sometimes I don't care about the theory or equations related to a topic, and just want the gist of it quickly.


Alright, thanks! It seems to have always gone past me.


https://en.wikipedia.org/wiki/Simple_English_Wikipedia

It is an intentionally simplified version of English. Although, in this case, the meaning of the article seems to have been affected by the simplification. The English version makes it clear that this is a theoretical concept, whereas the Simple English version makes it sound like something concrete/absolute.


Ironically, I found the Simple English articles I tried to read harder to understand than the regular ones. Interesting concept, though.


Ironic but inevitable; the simplification appears to mostly be a matter of vocabulary. Reducing the vocabulary usually eliminates most technical jargon, and when the subject-matter is technical, the result is, well, a mess.


or, how my Pixel XL feels in my hand after about 10 minutes of use


Looks like they found a way to measure my mix tape


If this were reddit I would've upvoted you, but this kind of cleverness, if it constitutes the whole post, should be left to reddit.

Now, should this thread ultimately become a discussion on the virtues of and approaches to creating mixtapes, that would be another thing, but at this point in time I'm not seeing this as a positive contribution to the discussion. That is why I downvoted your genuinely amusing comment.


No worries, I knew this isn't the place for it but I couldn't resist. Thanks for your honesty!


Not that I disagree with you, but I should point out that it's a reference to the video "Absolute hot" of the YouTube channel "Casually explained".


Personally, I don't think that changes what constitutes a good discussion contribution on HN.

There's plenty of discussion here going into the details of the physics and the semantics at hand. We don't have to lower the bar for the discussion just because the topic under discussion was presented in a simple way.


tl;dr - no fun allowed.


Have your fun, just also contribute to the discussion in a meaningful way.

If you're looking exclusively for mindless fun, you're really in the wrong place. And there's nothing wrong with enjoying that kind of thing either, there are just better places to do it than HN.


[flagged]


Please don't do this here.



