The following is an excerpt from our Lost in Space-Time newsletter. Each month we delve into fascinating ideas from around the universe. You can sign up for Lost in Space-Time here.
One of the most absurd things about science is that you can spend years studying and reading about the deepest mysteries of the universe—dark matter, quantum gravity, the nature of time—and still be blown away by something deceptively simple. Nobel Prize-winning theoretical physicist Richard Feynman famously admitted that as a student he didn’t really understand why mirrors flip images left to right and not up and down. I’m no Feynman; I know how mirrors work. But I had my own humbling reckoning with the obvious: temperature.
We’ve known that things can be hot or cold since the first cave child stuck his hand into a fire and was yelled at by a concerned parent. But what we mean by temperature has changed a lot over the centuries and is still evolving today as physicists push it into weirder, quantum corners.
My own brush with it came through my partner, who once asked: “My beautiful and amazingly intelligent wife, haven’t you studied physics? So tell me, can a single particle have a temperature?” I may be paraphrasing slightly here, but that was essentially his question.
His initial hunch was right: no, it can’t, not really. Most science geeks know that temperature isn’t something you can assign to just one particle. The business of hot and cold only makes sense as a property of systems with many, many particles—things like gas-filled pistons, coffee pots, or stars. This is because temperature, as we commonly define it, is a kind of shorthand. It captures the average energy of a system’s microscopic components as they bounce off one another and spread their energy around until the system reaches a state known as equilibrium.
Think of it like a ladder, where each rung represents a different energy level. The higher the rung, the more energy a particle sitting on it has. When there are a lot of particles, we expect them to be distributed across the rungs in a predictable manner: most settle at the bottom, a few have enough energy to climb one rung higher, and fewer still make it further up. The result is a smooth, decreasing particle count as you progress up the ladder.
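If you want to see that ladder in numbers, here is a minimal sketch of my own (the energy spacing and temperature are made up) assuming the standard Boltzmann distribution, in which the share of particles on a rung falls off exponentially with its energy:

```python
import numpy as np

# Toy sketch: populations of evenly spaced energy "rungs" for a large
# collection of particles at temperature T, assuming the Boltzmann
# distribution p_n ~ exp(-E_n / kT). The numbers below are made up.
k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # temperature in kelvin (roughly room temperature)
rung_spacing = 1e-21        # energy gap between rungs, in joules

energies = rung_spacing * np.arange(8)       # rungs 0 to 7
weights = np.exp(-energies / (k_B * T))      # Boltzmann factors
populations = weights / weights.sum()        # fraction of particles per rung

for n, p in enumerate(populations):
    print(f"rung {n}: {p:.3f}")
```

Turn T up and the particles spread further up the ladder; turn it down and they bunch up at the bottom. That single knob is what we call temperature.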
But why do we define temperature this way? Sure, it’s an average, but there’s nothing in the math that stops us from taking the average of a data set with a single point. If the only person in a room is 6 feet tall, we don’t blink when we say the average height of the people in that room is 6 feet. Why not do the same here?
This is because temperature is not only descriptive but also predictive. For scientists trying to harness the power of fuel, fire and steam in the 18th and 19th centuries, the most useful thing temperature could do was tell them what would happen when two systems interact.
This gave rise to the zeroth law of thermodynamics, the last of these laws to be established, but the most fundamental. It goes like this: if a thermometer reads 80°C in a cup of warm water and also reads 80°C in a cup of warm milk, then when we mix the two liquids, there should be no net exchange of heat between them. This may sound obvious – even trite – but it is the basis of classical thermometry.
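If you want to check the bookkeeping behind that claim, here is a toy calculation of my own using ordinary calorimetry (heat exchanged Q = mcΔT); the masses and heat capacities are simply made up for illustration:

```python
# Toy calorimetry sketch: mix two liquids and see how much heat moves.
# All numbers are made up for illustration.
m_water, c_water, T_water = 0.25, 4186.0, 80.0   # kg, J/(kg*K), deg C
m_milk,  c_milk,  T_milk  = 0.25, 3930.0, 80.0   # kg, J/(kg*K), deg C

# Equilibrium temperature of the mixture, from a simple energy balance.
T_mix = (m_water * c_water * T_water + m_milk * c_milk * T_milk) / (
    m_water * c_water + m_milk * c_milk
)

heat_into_water = m_water * c_water * (T_mix - T_water)
print(f"mixture temperature: {T_mix:.1f} deg C")
print(f"heat gained by the water: {heat_into_water:.1f} J")
# Both cups start at 80 deg C, so the mixture sits at 80 deg C and no heat
# passes either way. Start them at different temperatures and the same
# formula tells you which way the heat flows.
```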
And this is only true because large systems behave in a statistically stable manner. Tiny energy fluctuations between individual particles wash out, and the law of large numbers lets us make reliable, general statements.
Thermodynamics is special in this respect. Unlike, say, Isaac Newton’s laws of motion, which work well for one falling apple or a thousand, the laws of thermodynamics only appear at scale. They rely on averages, ensembles, and the mathematical magic that happens when your particle count climbs into the billions.
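Here is a quick numerical way of my own to watch that magic happen: hand every particle a random energy and see how much the average wobbles from one snapshot of the system to the next as the particle count grows.

```python
import numpy as np

# Toy illustration of the law of large numbers: the relative scatter of the
# average energy shrinks roughly like 1/sqrt(N) as the particle count N grows.
rng = np.random.default_rng(0)

for N in (10, 1_000, 100_000):
    # 200 independent snapshots of a system of N particles, each particle
    # assigned an exponentially distributed energy (arbitrary units).
    averages = np.array(
        [rng.exponential(scale=1.0, size=N).mean() for _ in range(200)]
    )
    rel_spread = averages.std() / averages.mean()
    print(f"N = {N:>7}: relative fluctuation of the average ~ {rel_spread:.4f}")
```

With a handful of particles the average jumps around; with a hundred thousand it barely moves. Scale that up to the trillions upon trillions of molecules in a cup of coffee and the fluctuations are, for all practical purposes, gone.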
So: individual particles have no temperature. Case closed.
Or so I thought. But just when I felt ready to move on, physics threw me a curveball. The first clue that things were about to get really weird is that many quantum systems are made up of only a handful of particles whose properties never stabilize.
Tiny systems—like single atoms or individual spins—can be trapped in states that never actually settle. Some are even deliberately engineered to resist reaching equilibrium altogether. So if temperature is supposed to describe what happens once things settle down, doesn’t our definition of temperature fall apart?

What exactly is temperature?
Physicists have been hard at work rethinking temperature from the ground up, asking what it even means to have a temperature in the quantum realm.
In the same vein as the pioneers of thermodynamics, researchers now ask not what temperature is, but what it does. If we take a quantum system and attach it to something else, in which direction does the heat move? Can the system warm its neighbor? Can it be cooled?
In the quantum world, the answer can be both! Let’s go back to the energy ladder that particles can climb. In the classical world, the rules here are simple: when two ladders (two systems) interact, energy always flows from the system with proportionally more particles on the higher rungs (the hotter one) to the system with fewer (the colder one).
But quantum systems don’t follow the same rules. A quantum system can have no particles at all on the bottom rung, with everything crammed into the rungs above. It can have a lumpy, uneven distribution, or particles spread evenly across every rung. Superposition even allows particles to exist between the rungs. When quantum mechanics comes into play, our ladder is no longer what physicists call “thermally ordered.”
This makes it difficult to predict how heat will flow when such a ladder interacts with something else. To deal with this, physicists came up with a curious solution: let quantum systems have two temperatures. Imagine a kind of reference ladder that represents a simple thermal system. One temperature tells you the hottest such ladder your system can still push heat into; the other tells you the coolest such ladder it can still pull heat from. Outside that window, heat flows in a predictable direction; inside it, the outcome depends on the exact nature of the quantum system. It amounts to a new zeroth law of thermodynamics, one that restores the logic of how heat flows in the quantum world.
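To make the two-temperature idea a little more concrete, here is a deliberately crude toy of my own (not the researchers’ actual construction): take a tiny ladder whose rung populations don’t follow a single Boltzmann curve, fit an effective temperature to every pair of rungs, and read off the coldest and hottest values.

```python
import numpy as np
from itertools import combinations

# Toy sketch only, not the real definition from the research: a three-rung
# "ladder" whose populations don't fit any single Boltzmann curve, so no
# single temperature describes it.
k_B = 1.0                                   # work in natural units
energies = np.array([0.0, 1.0, 2.0])        # evenly spaced rungs (arbitrary units)
populations = np.array([0.5, 0.3, 0.2])     # uneven, non-thermal populations

# For each pair of rungs, find the temperature a two-level thermal system
# with those same populations would have.
pair_temps = []
for i, j in combinations(range(len(energies)), 2):
    T_ij = (energies[j] - energies[i]) / (k_B * np.log(populations[i] / populations[j]))
    pair_temps.append(T_ij)

print(f"coldest pairwise temperature: {min(pair_temps):.2f}")
print(f"hottest pairwise temperature: {max(pair_temps):.2f}")
# In this toy picture, heat flows out of the system into any bath colder than
# the coldest value, and into the system from any bath hotter than the hottest
# value; for baths in between, the direction depends on the details.
```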
These two limits reflect the system’s potential to give or take energy regardless of whether it is in a state of equilibrium. Crucially, these temperatures depend not only on the energy, but on how that energy is structured: how the quantum particles or states are distributed in energy levels and what transitions the whole system supports.
And like their thermodynamic predecessors, quantum physicists are interested in making their systems do work. Imagine two atoms that are entangled—their properties are so closely correlated that measuring one affects the other. Now expose one atom to the environment. When that atom gains or loses energy, it pulls on the invisible quantum bond connecting the pair. Breaking or degrading that link costs something, like snapping a stretched rubber band. This creates a heat flow that would not occur without the quantum entanglement, which can then be used – by connecting the atom to a small quantum “piston” – to do work until the entanglement is used up. By assigning hot and cold effective temperatures to any quantum state, researchers can determine when the system can reliably transfer heat, extract work, or perform tasks such as cooling and computing.
If you’ve made it this far, here’s my confession: I argued with my partner that a single particle could have a temperature, even though his intuition was correct. Being the sore loser that I am, I spiraled down a big rabbit hole – and at the bottom I found out that we were both kind of right. A single particle can’t have one temperature, but it can have two.