In an earlier post, I wanted to explain temperature, but I think this deserves its own post. Temperature is complicated. It's clearly defined at the beginning of any class on thermodynamics or statistical mechanics, but I think that this definition will leave a popular audience cold (no pun intended):

1/T = dS/dE

Even if you understand everything about this equation (T is temperature, S is entropy, and E is energy), you would be hard-pressed to find the connection between this definition and our perceptions of "hot" and "cold".
So I'm going to explain it in the other direction, starting with hot and cold, and then working our way back to this definition.
Building a definition of temperature
I think we all intuitively understand that if we put a hot object and a cold object together, the hot object gets cooler and the cool object gets hotter. This represents an exchange of energy between the two objects. The energy flows from the hotter object to the cooler object. Eventually, the two objects will reach the same temperature.
But this is not the same as saying that the two objects reach the same energy. If hot steam comes out of a teapot, the steam clearly doesn't have as much energy as the entire earth's atmosphere, yet energy will still flow from the steam to the atmosphere as the steam cools down. It might seem like you could correct for this by considering the density of energy rather than the total energy, but this approach will also ultimately fail.
But we can definitely say one thing about temperature. The higher the temperature of an object, the more "willing" it is to give up its energy. The lower the temperature, the more "willing" it is to accept energy.
I want to point out a characteristic of this process that may not seem strange at first, but is. The process of energy flowing from hot objects to cold objects cannot be reversed. If we recorded the process, and played the recording backwards, it wouldn't make any sense. The hot object would get hotter, and the cool object would get cooler.
This is strange because pretty much every law of physics works the same way forwards in time and backwards in time. But there's one notable exception: the Second Law of Thermodynamics. The Second Law states that entropy increases with time (and therefore decreases if you play time backwards). So if a process occurs forwards, but not backwards, that means it must involve an increase in entropy. We'll get to the precise meaning of "entropy" later, but for now you just need to know it increases over time.
When energy leaves the hot object, the hot object's entropy goes down; when that same energy enters the cold object, the cold object's entropy goes up. And because the cold object is more "willing" to accept energy, it gains more entropy than the hot object loses. So in order to maximize entropy, it makes sense to move energy from the hotter object to the cooler object.
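This entropy bookkeeping can be sketched in a few lines of code. The temperatures and the amount of energy transferred below are made-up numbers, purely for illustration; the only real physics is the relation dS = dE/T.

```python
# Hypothetical numbers, just to illustrate the entropy bookkeeping.
T_hot = 400.0   # temperature of the hot object (kelvin)
T_cold = 300.0  # temperature of the cold object (kelvin)
dE = 1.0        # a small amount of energy (joules) moved from hot to cold

# Entropy change of each object: dS = dE / T
dS_hot = -dE / T_hot    # the hot object loses entropy
dS_cold = +dE / T_cold  # the cold object gains entropy

dS_total = dS_hot + dS_cold
print(dS_total)  # positive: total entropy went up
```

Because T_cold < T_hot, the entropy gained by the cold object always outweighs the entropy lost by the hot object, so the transfer is favored by the Second Law.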
Typically, the dependence of entropy on energy looks something like the blue curve below. Note that the more energy you have, the less quickly entropy increases. This produces a temperature which increases with energy (in purple).
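Here's a small numerical sketch of that picture. The log-shaped entropy curve is just an illustrative assumption (a curve that flattens as energy grows, like the one described above); the point is that temperature, defined as 1/(dS/dE), rises as energy rises.

```python
import numpy as np

# Toy model (an assumption for illustration): entropy grows like log(E),
# so its slope flattens as energy increases.
E = np.linspace(1.0, 10.0, 1000)
S = np.log(E)

# Temperature is defined by 1/T = dS/dE, so T = 1/(dS/dE).
dS_dE = np.gradient(S, E)
T = 1.0 / dS_dE

# The slope of S flattens as E grows, so T increases with E.
print(T[0] < T[-1])  # True
```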
However, there's no logical contradiction in having a system where the temperature decreases as the energy increases. Nor is there a logical contradiction in having a negative temperature. It just doesn't happen in stable systems.
What is Entropy?
Let's talk briefly about entropy (but not too long). Entropy is usually described as a measure of disorder. But more precisely, entropy is a measure of how many different ways a system can be, and yet still look the same from the big picture.
For example, from the big picture, a broken egg is a broken egg. But there are a lot of ways an egg can be broken, a lot of ways the cracks can go, and a lot of ways the shell can be scattered around. On the other hand, there is basically one way for an egg to be intact. So we could say that a broken egg has higher entropy than an intact egg.
But the egg is just a toy example. Here's a more realistic one: I can look at a gas from the big picture and measure its temperature and volume and pressure. But I can't see the motion of every individual particle. There are a lot of different ways the particles could be moving around, and yet they'd still look the same to me. So we can say that the gas has entropy.
Usually, the more energy a system has, the more ways there are to allocate that energy among all the little particles. And therefore it has more entropy.
The second law of thermodynamics is based on the idea that every possible microscopic state of a system is more or less equally likely. Therefore, from the big picture, high-entropy states are more likely, simply because there are more of them.
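The "more energy, more ways to allocate it" idea can be made concrete with a standard textbook toy model (my choice here, not from the post): an Einstein solid, where N oscillators share q indivisible units of energy. Counting the arrangements shows the entropy growing with energy.

```python
from math import comb, log

# Einstein-solid toy model: N oscillators sharing q energy quanta.
def multiplicity(N, q):
    # Number of ways to distribute q identical quanta among N oscillators
    # (a stars-and-bars count).
    return comb(q + N - 1, q)

N = 50
for q in (10, 20, 40):
    omega = multiplicity(N, q)
    print(q, log(omega))  # entropy (in units of k) grows with energy
```

For example, 3 quanta shared between 2 oscillators can be arranged 4 ways (3+0, 2+1, 1+2, 0+3), and the count grows rapidly as more energy is added.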
The Boltzmann Factor
Let's look at an individual particle in a gas. The more energy this particle has, the less is left over for the rest of the gas. The less energy the gas has, the less entropy it has. More precisely, if you divide the cost in energy by the temperature, then you will find the cost in entropy. The more entropy it costs, the less likely it is to occur.
But if a gas has a very high temperature, then it doesn't cost much entropy for our particle to be energetic. So the higher the temperature, the more likely it is for a particle to have high energies.
This relationship is described by the Boltzmann factor: the probability of a particle being in a state with energy E is proportional to e^(−E/kT), where k is Boltzmann's constant.
So another way to understand temperature is as a measure of the spread in energy. The higher the temperature, the greater the variance in energy held by each particle. The lower the temperature, the more the particles are confined to low energies.
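The "temperature as spread" picture can be checked numerically. The discrete energy levels below are an arbitrary illustrative choice, with Boltzmann's constant folded into the temperature units; the variance of the energy distribution grows as the temperature grows.

```python
import numpy as np

# Discrete energy levels (arbitrary units); an illustrative assumption.
E = np.arange(0, 10)

def boltzmann_probs(E, T):
    # P(E) is proportional to exp(-E / T), with Boltzmann's constant
    # folded into the temperature units.
    w = np.exp(-E / T)
    return w / w.sum()

for T in (0.5, 5.0):
    p = boltzmann_probs(E, T)
    mean = (p * E).sum()
    var = (p * (E - mean) ** 2).sum()
    print(T, var)  # higher temperature -> larger spread in energy
```

At low temperature the particles are pinned near the lowest energy level; at high temperature the distribution spreads across many levels.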
So now you can see that temperature is closely associated with the random energy held by individual particles in a system. But temperature is not the same as energy.