Wednesday, September 8, 2010

What is temperature?

In an earlier post, I wanted to explain temperature, but I think it deserves its own post.  Temperature is complicated.  It's clearly defined at the beginning of any class on thermodynamics or statistical mechanics:

1/T = ∂S/∂E

But I think this definition will leave a popular audience cold (no pun intended).
Even if you understand everything about this equation (T is temperature, S is entropy, and E is energy), you would be hard-pressed to find the connection between this definition and our perceptions of "hot" and "cold".

So I'm going to explain it in the other direction, starting with hot and cold, and then working our way back to this definition.

Building a definition of temperature

I think we all intuitively understand that if we put a hot object and a cold object together, the hot object gets cooler and the cold object gets hotter.  This represents an exchange of energy between the two objects.  The energy flows from the hotter object to the cooler object.  Eventually, the two objects reach the same temperature.

But this is not the same as saying that the two objects reach the same energy.  If hot steam comes out of a teapot, the steam clearly doesn't have as much energy as the entire earth's atmosphere, yet energy will still flow from the steam to the atmosphere as the steam cools down.  It might seem like you could correct for this by considering the density of energy rather than the total energy, but this approach will also ultimately fail.

But we can definitely say one thing about temperature.  The higher the temperature of an object, the more "willing" it is to give up its energy.  The lower the temperature, the more "willing" it is to accept energy.

I want to point out a characteristic of this process that may not seem strange at first, but is.  The process of energy flowing from hot objects to cold objects cannot be reversed.  If we recorded the process, and played the recording backwards, it wouldn't make any sense.  The hot object would get hotter, and the cool object would get cooler.

This is strange because pretty much every law of physics works the same way forwards in time and backwards in time.  But there's one notable exception: the Second Law of Thermodynamics.  The Second Law states that entropy increases with time (and therefore decreases if you play time backwards).  So if a process occurs forwards, but not backwards, it must involve an increase in entropy.  We'll get to the precise meaning of "entropy" later, but for now you just need to know that it increases over time.

The exchange of energy between hot and cold objects represents a redistribution of energy to where it will contribute the most entropy.  In a cold object, a little energy goes a long way toward increasing the entropy.  In a hot object, it takes a lot of energy to increase the entropy just a bit.  So in order to maximize entropy, it makes sense to move energy from the hotter object to the cooler object.
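As a toy illustration (mine, not from the original post), here's a Python sketch using the standard "Einstein solid" counting model, where entropy is the log of the number of ways to distribute indivisible energy quanta among oscillators.  The particular sizes (20 and 80 oscillators, 100 quanta) are arbitrary choices:

```python
from math import comb, log

def entropy(q, n):
    # ln(number of ways to spread q energy quanta over n oscillators)
    # -- the "Einstein solid" toy model, with Boltzmann's constant k = 1
    return log(comb(q + n - 1, q))

# Two objects share 100 quanta: a small one (20 oscillators)
# and a large one (80 oscillators).
n_small, n_large, total = 20, 80, 100

# Find the energy split that maximizes the total entropy.
best = max(range(total + 1),
           key=lambda q: entropy(q, n_small) + entropy(total - q, n_large))
print(best)  # -> 19, close to a proportional share (20) for the small object
```

If the small object starts out holding most of the quanta, total entropy goes up as energy flows into the larger (cooler) object, which is exactly the one-way flow described above.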

Typically, the dependence of entropy on energy looks something like the blue curve below.  Note that the more energy you have, the less quickly entropy increases.  This produces a temperature which increases with energy (in purple).
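As a numerical sketch of this (mine, not the post's figure), the same quanta-counting toy model shows entropy growing ever more slowly with energy, which makes the temperature T = 1/(dS/dE) increase with energy:

```python
from math import comb, log

def entropy(q, n):
    # ln(multiplicity) for q energy quanta among n oscillators (k = 1)
    return log(comb(q + n - 1, q))

N = 50
for q in (10, 50, 200):
    dS = entropy(q + 1, N) - entropy(q, N)  # slope of the entropy curve
    print(q, round(1.0 / dS, 2))            # temperature T = 1 / (dS/dE)
```

The printed temperature climbs (roughly 0.59, 1.49, 4.58 in these units) as the energy grows, because each extra quantum buys less and less entropy.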

However, there's no logical contradiction in having a system where the temperature decreases as the energy increases.  Nor is there any logical contradiction in having a negative temperature.  It just doesn't happen in stable systems.

What is Entropy?

Let's talk briefly about entropy (but not too long).  Entropy is usually described as a measure of disorder.  But more precisely, entropy is a measure of how many different ways a system can be arranged and yet still look the same from the big picture.

For example, from the big picture, a broken egg is a broken egg.  But there are a lot of ways an egg can be broken, a lot of ways the cracks can go, and a lot of ways the shell can be scattered around.  On the other hand, there is basically one way for an egg to be intact.  So we could say that a broken egg has higher entropy than an intact egg.

But the egg is just a toy example.  Here's a more realistic one: I can look at a gas from the big picture and measure its temperature and volume and pressure.  But I can't see the motion of every individual particle.  There are a lot of different ways the particles could be moving around, and yet they'd still look the same to me.  So we can say that the gas has entropy.

Usually, the more energy a system has, the more ways there are to allocate that energy among all the little particles, and therefore the more entropy it has.
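For a concrete (if tiny) version of this counting, not from the original post: a brute-force enumeration of the ways a few indivisible energy units can be split among three distinguishable particles.

```python
from itertools import product

def count_microstates(total_energy, n_particles):
    # Count every way to assign whole energy units to distinguishable
    # particles so that they sum to total_energy.
    return sum(1 for split in product(range(total_energy + 1),
                                      repeat=n_particles)
               if sum(split) == total_energy)

for q in (2, 4, 8):
    print(q, count_microstates(q, 3))  # -> 6, 15, 45: more energy, more ways
```

All of these arrangements look the same "from the big picture" (same total energy), so the count is exactly the kind of multiplicity that entropy measures.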

The second law of thermodynamics is based on the idea that every possible state of a system is more or less equally likely.  Therefore, from the big picture, high entropy states are more likely.

The Boltzmann Factor

Let's look at an individual particle in a gas.  The more energy this particle has, the less is left over for the rest of the gas.  And the less energy the rest of the gas has, the less entropy it has.  More precisely, if you divide the cost in energy by the temperature, you get the cost in entropy.  The more entropy it costs, the less likely that state is to occur.

But if a gas has a very high temperature, then it doesn't cost much entropy for our particle to be energetic.  So the higher the temperature, the more likely it is for a particle to have high energies.

This relationship is described by the Boltzmann factor:

p ∝ e^(−E/kT)

p is the probability that a single particle is in a state with energy E.  T is the temperature.  k is just a constant (Boltzmann's constant) to make the units work out.  The equation says that p decreases exponentially with E/kT.

So another way to understand temperature is as a measure of the spread in energy.  The higher the temperature, the greater the variance in energy held by each particle.  The lower the temperature, the more the particles are confined to low energies.
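To see that spread directly, here's a sketch (mine; energies and k are in arbitrary units) that computes the mean and standard deviation of a particle's energy under Boltzmann weights at a few temperatures:

```python
from math import exp

def energy_spread(T, levels=200):
    # Boltzmann weights p ∝ exp(-E/kT), with k = 1 and E = 0, 1, ..., levels-1
    w = [exp(-E / T) for E in range(levels)]
    Z = sum(w)                        # normalization (partition function)
    p = [x / Z for x in w]
    mean = sum(E * pE for E, pE in enumerate(p))
    var = sum((E - mean) ** 2 * pE for E, pE in enumerate(p))
    return mean, var ** 0.5

for T in (1, 5, 20):
    mean, spread = energy_spread(T)
    print(f"T={T}: mean energy {mean:.1f}, spread {spread:.1f}")
```

Both the mean energy and the spread grow with T: at low temperature the particle is pinned near the lowest energies, while at high temperature a wide range of energies becomes likely.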

So now you can see that temperature is closely associated with the random energy held by individual particles in a system.  But temperature is not the same as energy.


Mark Erickson said...

Good stuff, although I'm going to have to read the Boltzmann Factor a couple more times. [How cool would it be to have a science thingy named after you? Or even several, like Boltzmann.]

Under What is Entropy, you missed a word: "So we can say that the gas has high entropy." ... right?

Larry Hamelin said...

Interesting stuff. A lot I've heard before, but haven't been able to tie together.

I can, in a sense, measure the pressure of a gas by squeezing it and seeing how much it resists the squeezing. I can also measure the temperature by feeling it. What precisely is it that I'm measuring in both cases?

Another way of looking at my question: what is the simplest possible (i.e. made of psychic unobtanium, with whatever convenient properties it needs to be simple, however unrealistic) pressure sensor and simplest possible temperature sensor?

miller said...

The gas may have high or low entropy depending on what you think is high.

Pressure is easy to measure, as you said, by squeezing the gas and measuring how much force is resisting the squeezing.

Temperature is harder. I don't know anything about skin nerves, but I imagine that there's some nerve that determines whether heat is flowing in or out and at what rate. The flow of heat is usually proportional to the difference in temperature from your hand to the object. There could also be some other nerve that measures absolute temperature, perhaps by looking at the rate of some chemical reaction. Usually, high temperatures lead to faster molecules which lead to faster chemical reactions.

Thermometers measure temperature indirectly by measuring volume of some substance. You just need some substance, like mercury, whose volume increases linearly with temperature.

Larry Hamelin said...

Let me put my question another way.

I'm a very Newtonian kind of guy. I think about everything in terms of springs, levers, pistons, and little infinitely elastic steel balls bouncing around. I can very easily visualize pressure as a gas pushing on a piston compressing a spring.

Is there a way of expressing temperature in this sort of macroscopic Newtonian metaphor? Or are we in "it's just not like that" territory as we are with QM?

miller said...

Yeah, I can't really think of a way to directly measure temperature from a macro perspective. It's a measure of the spread of energy in particles.

Interestingly, the relationship between pressure and volume is similar to the relationship between temperature and entropy (they're called thermodynamic conjugates). Pressure tells you how much energy is released when you increase the volume. Temperature tells you how much energy it costs to increase the entropy. Unfortunately entropy is not something you can control macroscopically like you can volume.

Secret Squïrrel said...

Miller, with regard to temperature-sensing nerves (at least in most mammals), there are generally two types. One detects an inflow of thermal energy ("hot") and the other an outflow ("cold").

Also, in the second para I think T should be temp.

miller said...

Right you are. Thanks for the correction.