# Why does heat increase entropy?


My brain can’t understand this at all. In an isolated system with cold molecules on one side and hot molecules on the other, I understand how the heat of this system balances with time, but it’s confusing to me that a system with a more evenly distributed temperature has increased randomness when it appears that there’s more order.

This feels like it should be simple, but my brain simply isn’t getting it, no matter how many analogies or examples I read. I’ve got to be missing something very simple, and that’s why it’s so frustrating that I don’t understand it. This is seriously stressing me out.

In: Physics

Because the more energy a system has, the more things that energy can be doing — the more states things can be in. If there is little energy in a closed system, there are fewer possible variations of things.

Think of it this way. Which is a more random situation:

- A box with 50 bouncy balls, where 25 are bouncing around but the other 25 are just kind of rolling around the bottom, or
- A box where all 50 balls are bouncing around (not with the same intensity as the 25 from the first box, but still bouncing around)?

The one where more balls are bouncing around is more random, right?

Also, consider this. You think the evenly heated box is more orderly because it's uniform, just one big group, right? But it isn't: in reality it's many, many, many small molecules. When half the molecules are at cold temperatures and not moving around much, those ones are way easier to keep track of than when they are all moving around a lot.

It’s the difference between counting a crowd of people when they are all stood still for you, vs when everyone is walking around and mixing together.

Entropy represents the disorder in a system. That has a VERY SPECIFIC, scientific meaning that is not the same as the everyday meaning of the word.

Under that definition, ORDER represents structures with useful function. In terms of energy, it means energy that is available and able to be used to produce work. DISORDER represents energy that has no useful function. The more uniformly energy is distributed at equilibrium, the more disorder the system has: the energy is not ordered, and not useful.

The distribution of the heat is orderly (in the everyday meaning of the word). The system of energy, for the purposes of its use, is disordered because it cannot be used.

The whole idea of using the word "disorder" to describe entropy does draw some criticism. I prefer to think of entropy as a measure of how easy a system's energy is to use to do stuff. That's mostly what "orderly" really means here.

The same total energy can be in two nearly identical systems, but the one with high entropy can’t be used to do stuff. E.g., a heat gradient (one cold side, one hot side) is an ingredient for a heat engine. Engines perform work. You can use the energy to power a pump, or move an object (assuming you build something to use a heat engine).

In contrast, turning the energy of a gas cloud with the same temperature everywhere into useful work is difficult. You can't make a heat engine! You'd have to put in EXTRA work just to get energy out of it, e.g., by splitting atoms in the gas cloud (and you'd spend more energy than you got in return).
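If it helps to see the numbers: the Carnot limit (the textbook ceiling on any heat engine's efficiency) makes the same point quantitatively. Work can only be extracted from a temperature *difference*, and the usable fraction shrinks to zero as the difference disappears. A small sketch with illustrative temperatures:

```python
# Carnot efficiency: the maximum fraction of heat that any heat engine
# can convert to work, given a hot reservoir at t_hot and a cold one
# at t_cold (temperatures in kelvin). Numbers below are illustrative.

def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum efficiency = 1 - T_cold / T_hot."""
    if t_cold > t_hot:
        raise ValueError("hot reservoir must be at least as hot as cold one")
    return 1.0 - t_cold / t_hot

# A strong gradient lets you extract a lot of work...
print(carnot_efficiency(600.0, 300.0))  # 0.5 -> up to 50% of heat becomes work

# ...but as the gradient vanishes, so does the usable energy.
print(carnot_efficiency(301.0, 300.0))  # ~0.003 -> almost nothing
print(carnot_efficiency(300.0, 300.0))  # 0.0 -> uniform temperature, no work
```

With a uniform temperature the efficiency is exactly zero, which is the "can't make a heat engine" case above.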

Orderly systems, i.e., systems with complexity and organization, like life, require useful energy to exist. E.g., your body needs to gather nutrients in specific places. Higher entropy would be those nutrients being evenly distributed everywhere instead of where they need to be. Nutrients being where they need to be is orderly. Your body uses energy to fight entropy and move nutrients to where they need to be.

So, entropy has a mathematical definition (well, several, but this one applies here). Entropy is not just “disorder,” it’s “the number of ways a system can be arranged.”

Entropy = k · ln(W), where k is a constant (Boltzmann's constant) and W is the number of ways a system can be arranged.

So imagine a box which perfectly fits 10 balls. They are all identical white balls, so no matter what you do, there is only 1 way to arrange them. ln(1) = 0, so there is literally zero entropy.

If you swap 1 white ball for a black ball, now the black ball could be in 10 different “otherwise identical states.” So there would be 10 ways to arrange this system.

If you had 5 white balls on one side and 5 black balls on the other side, again there would be only one way to arrange them. If you allow 1 ball of each color to cross over to the other side, the number of ways to arrange the system increases.
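The ball-counting examples above can be checked directly. A small Python sketch (using SI units for Boltzmann's constant; the specific box sizes are just the ones from the examples):

```python
import math

# Boltzmann's formula: S = k * ln(W), where W is the number of
# distinguishable arrangements (microstates).
k_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy(w: int) -> float:
    return k_B * math.log(w)

# 10 identical white balls: only 1 arrangement -> zero entropy.
print(entropy(1))  # 0.0

# Swap in 1 black ball: it can sit in any of the 10 slots -> W = 10.
print(entropy(10))

# 5 white + 5 black balls free to mix anywhere: W = C(10, 5) = 252,
# the number of ways to choose which 5 slots hold the black balls.
print(math.comb(10, 5))  # 252
print(entropy(math.comb(10, 5)))
```

Letting the colors mix jumps W from 1 to 252, so the entropy jumps too — that's the "crossing over" step in numbers.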

So statistically, if you put a bunch of balls in a box and shook them up, it's enormously more likely that they would end up mixed rather than neatly divided by color. But there's no "force" pushing them that way; mixed arrangements simply vastly outnumber separated ones.
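How lopsided that count gets is easy to estimate. In a toy model with 2n slots and n black balls, only 2 arrangements are perfectly separated (all black on the left, or all on the right), while there are C(2n, n) arrangements in total. A rough sketch:

```python
import math

def mixed_to_separated_ratio(n: int) -> float:
    # 2n slots, n of them black. Total arrangements: C(2n, n).
    # Exactly 2 of those are perfectly separated by color.
    return math.comb(2 * n, n) / 2

for n in (5, 25, 50):
    print(n, mixed_to_separated_ratio(n))
# n=5  -> 126.0
# n=25 -> ~6.3e13
# n=50 -> ~5e28
```

Even at 50 balls the separated arrangements are outnumbered by a factor of about 10^13, and it only gets more extreme for real systems with ~10^23 molecules.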

You can imagine the balls as atoms, and heat as the shaking. They will settle into a "more likely" configuration, which is how we define maximum entropy.
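You can even watch this happen in a toy simulation. The sketch below (parameters are arbitrary choices, not from the original) starts with a perfectly separated box and "shakes" it with random swaps; the left half drifts toward a 50/50 mix and stays there, because mixed configurations dominate:

```python
import random

def shake(n_per_side: int = 50, swaps: int = 10_000, seed: int = 1) -> float:
    """Start with all 'cold' (0) balls on the left and 'hot' (1) balls
    on the right, then randomly swap positions. Returns the fraction of
    hot balls that ended up in the left half."""
    rng = random.Random(seed)
    box = [0] * n_per_side + [1] * n_per_side
    for _ in range(swaps):
        i = rng.randrange(len(box))
        j = rng.randrange(len(box))
        box[i], box[j] = box[j], box[i]
    return sum(box[:n_per_side]) / n_per_side

print(shake())  # hovers around 0.5: the mixed configurations win
```

The swaps are completely symmetric — nothing "pushes" toward mixing — yet the box almost never returns to the separated state, which is the statistical heart of the second law.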