Entropy is a statistical measure of the disorder in a system.
Say you have two rooms with a well-insulated door between them. One room has been heated until it's very warm. The other room has been chilled until it's very cold.
This is a highly ordered system — all the hot air (rapidly moving molecules) is on one side of the door and all of the cold air (slowly moving molecules) is on the other side of the door.
That would be a *low* entropy system because the amount of disorder is low. There’s nothing random about this system. We would know where to look to find rapidly moving molecules or slowly moving molecules. And something (in this case, a heater and a chiller) has acted on the system to create that order.
Now, let’s say you open the door between Hot Room and Cold Room so there is no longer a barrier between them.
What happens?
You know that intuitively. What happens is that over time you go from one Hot Room and one Cold Room to two Medium Temperature Rooms.
Now you have a *high* entropy system because the amount of disorder is high. No longer are all the rapidly moving molecules in one area and the slowly moving molecules in another area. In fact, if you were asked to predict which of the two rooms had *more* rapidly moving molecules, you couldn't, because the remaining rapidly moving molecules are now distributed randomly between the two rooms.
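If it helps to see that mixing happen, here's a toy Python sketch. It's not real thermodynamics, just a made-up shuffling model (the molecule counts and step numbers are arbitrary): every fast molecule starts on one side, every slow molecule on the other, and then randomly chosen molecules swap through the open door. The fast fraction in each room drifts toward 50/50, which is the "two Medium Temperature Rooms" outcome.

```python
import random

N = 1000                  # molecules per room (arbitrary for illustration)
left = ["fast"] * N       # hot room: all rapidly moving molecules
right = ["slow"] * N      # cold room: all slowly moving molecules

def fast_fraction(room):
    """Fraction of fast molecules in a room; 0.5 means fully mixed."""
    return sum(1 for m in room if m == "fast") / len(room)

for step in range(20001):
    # Pick one random molecule from each room and swap them through the open door.
    i, j = random.randrange(len(left)), random.randrange(len(right))
    left[i], right[j] = right[j], left[i]
    if step % 5000 == 0:
        print(f"step {step:5d}: fast fraction left={fast_fraction(left):.2f}, "
              f"right={fast_fraction(right):.2f}")
```

Run it and the printout starts at 1.00 / 0.00 (perfectly ordered, low entropy) and settles near 0.50 / 0.50 (mixed, high entropy), and once it's there, random swapping never sorts it back out.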
Side note: what we’ve just described here is the Second Law of Thermodynamics, which says, in brief, that over time an isolated system always moves in the direction of higher entropy.