ELI5-What is entropy?


33 Answers

Anonymous 0 Comments

Entropy is the tendency of everything to become more bland: undefined, equal in temperature, and in a state of equilibrium where no action can be taken because everything is at the lowest energy possible. Although some say “increasing chaos”, it’s really “increasing blandness”, as the universe will eventually be equal temperature, equal everything.

Let’s use temperature as an example. Hot water dumps its energy to become room temperature. The room dumps energy into the atmosphere to become atmosphere temperature. The atmosphere dumps energy into space to become space temperature. You want to become not-the-same-temperature as everything around you? Gotta spend energy (like a fridge, oven, AC system, fire, etc.).

The main thing keeping us from becoming space temperature is the sun, but the sun is burning through 4 million tons of mass every second to do that and won’t last forever. Eventually it will die out; long after that, the sun’s mass will form part of a new star, but that process can’t go on forever either.

Anonymous 0 Comments

Picture a simple rollercoaster with hills and valleys all at different heights, where the beginning and end are the highest points. Entropy says that no matter what happens, the coaster will eventually settle into the lowest valley available. Sure, if given a push it can be moved uphill to another, higher valley and settle there, but even then, due to random chance (a freak gust of wind, a branch falling and nudging the cart just the right way), the cart will inevitably get back to the bottom. It’s happiest there. Once there, there is literally zero chance of it getting any lower, and it takes effort to do anything else. Everywhere else on the rollercoaster there is some chance, no matter how small, that it could move to the lowest point. This inexorable pull towards the lowest available energy state is entropy.

What this inevitably translates to is that everything in the universe at an elevated energy state will decay. Your made bed gets disheveled, buildings crumble, metal rusts, atoms decay, black holes evaporate, and so on, until eventually the entire universe is one homogeneous, thermodynamically neutral medium where there is literally no more rollercoaster to coast.

Anonymous 0 Comments

Entropy is a measure of the number of ways in which a system can exist. The most intuitive example (to me) is a solid versus a gas. In a solid, the molecules (or atoms) are held rigidly in place with little ability to move around. In a gas, the molecules (or atoms) can fly around freely and move all over the place. So the gas has more entropy, because the molecules that make it up can exist in space in more ways by moving around freely.
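If you like numbers, here is a tiny Python sketch of that counting idea. The grid size and atom count are made up for illustration: the “solid” is locked into one arrangement, while the “gas” atoms can occupy any cells, and we take the logarithm of the count, Boltzmann-style.

```python
import math

def entropy(num_microstates):
    # Boltzmann-style: S = ln(W), with the constant k_B set to 1 for simplicity
    return math.log(num_microstates)

cells = 100  # imaginary grid of positions inside a container
atoms = 10

# "Solid": every atom is locked into one specific spot -> only 1 arrangement
solid_states = 1
# "Gas": the atoms can occupy any 10 of the 100 spots
gas_states = math.comb(cells, atoms)

print(entropy(solid_states))  # → 0.0
print(entropy(gas_states))    # ≈ 30.5 (about 17 trillion arrangements)
```

The exact numbers don’t matter; the point is that the gas has vastly more ways to exist, so its entropy is higher.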

Admittedly, this isn’t the best ELI5 explanation, but I hope it helps.

Anonymous 0 Comments

I’ve always thought about it as process irreversibility. Things don’t naturally get more ordered over time. For example, think about a desk that you work at. If that desk starts clean and orderly, it will inherently become disordered over time, unless you take a specific action to reset/clean it.

I hope that helps a little. Entropy is a very abstract concept, but at the end of the day it’s just a mathematical concept that shows processes cannot be fully reversed.

Anonymous 0 Comments

You know how your earphones seem to get tangled a lot?

It’s all about statistics. Your earphones have more ways to be tangled than untangled, therefore they will more often than not become tangled.

Why is that special? Because it shows a one-way tendency, a natural “push” from one state to another. That’s entropy.
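You can feel this statistics argument with a toy simulation. The model here is my own crude stand-in for real tangling (no actual knot theory): shuffle a cord’s segments and see how rarely they land back in their one “untangled” order.

```python
import random

# Toy model (an assumption, not real knot physics): a cord with n segments
# counts as "untangled" only when its segments sit in their original order.
def untangled_fraction(n_segments, trials=100_000, seed=0):
    rng = random.Random(seed)
    cord = list(range(n_segments))
    hits = 0
    for _ in range(trials):
        shuffled = cord[:]
        rng.shuffle(shuffled)
        if shuffled == cord:
            hits += 1
    return hits / trials

# With just 5 segments, only 1 of 5! = 120 orderings is "untangled",
# so a randomly jostled cord almost always ends up tangled.
print(untangled_fraction(5))
```

Add more segments and the untangled fraction collapses toward zero, which is exactly the one-way “push” the answer describes.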

Anonymous 0 Comments

Entropy is a concept that initially was just something physicists cooked up for 2 reasons:

1. To have some benchmark for heat engine efficiency. (See Ideal/Carnot Heat Engines)

2. To definitively falsify the possibility of machines that could be in perpetual motion. (Lots of charlatans would claim they invented free energy systems and cheat people out of their money)

It was later crowned the “2nd law of thermodynamics” (i.e. we recognized it as being as fundamental as energy conservation), and we have been noticing that although entropy (just like energy conservation) is a classical description, it appears in some form all over nature. (There is a very recent paper from L. Susskind et al. where they show that even complex systems could theoretically exhibit something analogous to entropy.)

Entropy has many definitions, but the most common you’ll see is: a quantitative measure of a system’s order/disorder, where the most common definition of order/disorder is the number of states available on the microscopic level for a given macroscopic state. The fewer microscopic states available, the lower the entropy; as these states increase, the entropy increases until it hits a maximum. We define this maximum entropy as thermal equilibrium (where things get very boring).
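In symbols, that microstate-counting definition is Boltzmann’s formula, S = k_B · ln(Ω), where Ω is the number of available microstates. A minimal Python sketch (the microstate counts are made up):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(omega):
    # S = k_B * ln(Omega): more available microstates -> higher entropy
    return K_B * math.log(omega)

# One microstate (perfect order) gives zero entropy; entropy then grows
# only logarithmically as the number of microstates explodes.
for omega in (1, 10, 10**6):
    print(omega, boltzmann_entropy(omega))
```

The logarithm is why entropy is additive: two independent systems multiply their microstate counts, but their entropies simply add.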

Anonymous 0 Comments

Suppose you have a container of fluid. Half the container is filled with hot fluid, the other half with cold, and there is a divider in between. Now, suppose you remove the barrier and allow the hot and cold fluid to mix. Once the two parts of the fluid have fully mixed, it will all be the same temperature. The entropy has increased: even though the total amount of heat in the system has remained the same, there is no more *free energy*.

That is, if there is a *cold load* and a *hot load*, then you can use the difference in temperature to do work: to lift a weight or move a vehicle forward. To do this, you would use a *heat engine*. The second law of thermodynamics says that heat will not flow from a cold spot to a warm spot on its own: you cannot move heat from a cold spot to a warm spot without expending energy. This is the same principle that makes entropy increase.
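For the curious, the best-case fraction of heat that any engine can turn into work is the Carnot limit, 1 − T_cold/T_hot (temperatures in kelvin). A small sketch, with example temperatures chosen purely for illustration:

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    # Best possible fraction of the heat flow that can become work
    # for an ideal (Carnot) engine between these two temperatures.
    return 1.0 - t_cold_k / t_hot_k

# Boiling water (373 K) against room temperature (293 K):
print(carnot_efficiency(373.0, 293.0))  # ≈ 0.21, so at most ~21% becomes work

# After the fluids have fully mixed, both "reservoirs" are at the same
# temperature: the efficiency is zero, and no free energy is left.
print(carnot_efficiency(300.0, 300.0))  # → 0.0
```

This is the quantitative version of “fully mixed means no more free energy”: no temperature difference, no work.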

The process of allowing the hot and cold fluids to mix is an *irreversible* one. That is, you cannot easily separate out the hot (high energy) and cold particles (low energy) to return the system to its original configuration. You can expend energy to return it to a similar configuration, but the hot and cold particles will be different than in the original configuration. An irreversible process always increases the entropy of a system.

Anonymous 0 Comments

The statistical definition of entropy is that it is proportional to the number of microstates that can make up a macrostate.
Now, what is a macrostate? A hamburger. There are different kinds of hamburgers, but let’s say you consider a cheeseburger. That’s our macrostate. What is a microstate? The way you put the parts in the sandwich. You can put the cheese on the lettuce or the lettuce on the cheese. The tomato can be on top, or, if you are very energetic, the meat patty can be on top, but it is still a cheeseburger in the end. So how many ways can you make a cheeseburger? A lot of ways. So the entropy of a cheeseburger is high. What if I restricted you to only buns, 1 meat patty, and 1 slice of cheese? Well, now the ways you can make the cheeseburger are quite limited, so the entropy of this restricted cheeseburger is low.

So entropy tells us how many ways a system can be built from its parts.
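To put rough numbers on the burger picture (the ingredient lists are my own, and I’m treating every part, including the two buns, as distinguishable), count the stacking orders with a factorial:

```python
import math

# Hypothetical ingredient lists; each distinct stacking order counts
# as one microstate of the "cheeseburger" macrostate.
loaded = ["bottom bun", "patty", "cheese", "lettuce", "tomato", "onion", "top bun"]
basic = ["bottom bun", "patty", "cheese", "top bun"]

def stackings(ingredients):
    # n distinguishable parts can be piled in n! different orders
    return math.factorial(len(ingredients))

print(stackings(loaded))  # → 5040: many microstates, high entropy
print(stackings(basic))   # → 24: few microstates, low entropy
```

Adding just three ingredients multiplied the microstate count by 210, which is why entropy climbs so fast as systems get more parts.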

Anonymous 0 Comments

Entropy is often defined as the measure of disorder of a system, but this definition is misleading, because the universe could not care less about the human concept of order. Order in this case has more to do with the ability to change: when entropy is at its maximum, no change is possible and the system ~~is out of energy~~ has its energy evenly distributed. Also, the information needed to describe the system is at its maximum.
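That last point about information can be made concrete with Shannon entropy, a close cousin of thermodynamic entropy. A sketch using a made-up four-cell system:

```python
import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)): average number of bits needed to describe the state
    return -sum(p * math.log2(p) for p in probs if p > 0)

# All energy in one cell: the system is cheap to describe
concentrated = [1.0, 0.0, 0.0, 0.0]
# Energy evenly spread over all cells: describing it takes the most information
even = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(concentrated))  # → 0.0
print(shannon_entropy(even))          # → 2.0
```

The evenly-spread state needs the most bits to pin down, which is exactly the “information at its maximum” idea above.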

So, I have always thought of entropy as actually measuring the ever-decreasing ability of the universe to change. If the process cannot be reversed, the final fate of the universe will be a cold and dark immutability.

Entropy also gives us the arrow of time, but this is another topic altogether.

Anonymous 0 Comments

Pretend your freshly built Lego castle is the universe. Then I kick it until it breaks into its component parts. Entropy.