# What does the concept of entropy mean?

The most intuitive way to think of entropy is to consider it as a measure of how disordered a system is.

For example, consider a fresh pack of cards. It’s in one order and one order only, and it’s very easy to tell when the deck is no longer in that order. The fresh pack has very low entropy, because there’s only the one arrangement it can be in.

Now, if you shuffle the deck, so that the cards are completely randomized, you’ve raised the entropy of the deck. You can rearrange the individual cards very freely without damaging your ability to say “Yes, that’s a shuffled pack of cards.”
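In statistical mechanics this counting is made literal: Boltzmann's formula S = k·ln W sets entropy proportional to the logarithm of the number of arrangements W consistent with a description. A minimal sketch of the deck example in Python (dropping the constant k):

```python
import math

# Boltzmann-style entropy: S = ln(W), where W is the number of
# arrangements ("microstates") consistent with a description.
# (Physical entropy multiplies this by Boltzmann's constant.)

ordered_W = 1                    # "factory-fresh order": exactly one arrangement
shuffled_W = math.factorial(52)  # "some shuffled order": any of 52! arrangements

print(math.log(ordered_W))   # 0.0 -> minimal entropy
print(math.log(shuffled_W))  # ~156.4 -> vastly higher entropy
```

The point of the logarithm is just to make these astronomically different counts comparable: 52! is about 8 × 10^67 arrangements.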

Another way to think of entropy is the ability to pull useful work out of a system. For example, you need a temperature difference to do any work with a heat engine; if there’s no gradient, nothing’s going to want to move from point A to point B. You need *low entropy*, a condition of order and being able to say “This is different than that,” in order to perform work.

Everything in the universe becomes less orderly over time: carbon reacts, atoms decay, and so on. The early universe was essentially all the same type of matter, and now it is complex. It would be very difficult, if not impossible, to put it back together the way it was.

A kettle full of boiling water and a glass of ice water both will become lukewarm with enough time.

It basically describes the observed tendency of physical systems to dissipate energy over time through various random effects, thereby empirically ruling out one type of perpetual motion machine.

To give an easier-to-understand analogy, think of a card pyramid. That is a low-entropy system, because the mechanical stress each card experiences (an analog for energy) is different. Now say that, due to vibrations or drafts (an analog for those random effects), the pyramid crumbles. That is a high-entropy system, because each card now experiences more or less the same level of mechanical stress.

The reason entropy is sometimes referred to as randomness is that a same-energy system is much harder to predict, because each element in it can move and jitter on its own. To go back to the analogy: there aren't many ways to build a card pyramid, but the crumple pattern is always unique.

You can set off a stick of dynamite. But you can’t put it back together again. The amount of disorder in the universe always increases.

Well, technically you could take all of the gasses etc produced in the explosion, refine them, process them, and after a lot of work end up with a stick of dynamite again. But by introducing a lot of order to that random swirl of gasses, you will introduce even more disorder in the rest of the universe – e.g. you’re burning stuff to generate electricity to power the machines, you likely have used pure chemicals as part of your reactions and now you have a load of mixed chemical waste, etc. The amount of disorder in the universe always increases.

The “disorder” in this example is technically called “entropy”.

One way to think about entropy is from a statistical point of view. This basically says that if there are more ways of arranging things which result in the same outcome, the entropy is higher. For example, you have 10 balls: 5 identical black and 5 identical white. You arrange them in a row. There are exactly 2 ways to arrange them in an alternating pattern, and likewise only 2 ways of having all 5 of each color touching. But say I want the 5 black balls always grouped together, with the white balls wherever. There are exactly 6 ways to arrange that (the block of black balls can start at any of positions 1 through 6), so the state "5 black balls touching" has higher entropy than "balls alternating". And so on.
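These counts are small enough to check by brute force. A quick sketch in Python, enumerating every distinct row of five black and five white balls (the enumeration finds six placements for the grouped black balls, since the block can start at any of positions 1 through 6):

```python
from itertools import combinations

# Every distinct row of 5 identical black (B) and 5 identical white (W)
# balls is fixed by choosing which 5 of the 10 positions are black.
rows = []
for black in combinations(range(10), 5):
    rows.append("".join("B" if i in black else "W" for i in range(10)))

alternating = [r for r in rows if r in ("BWBWBWBWBW", "WBWBWBWBWB")]
both_blocks = [r for r in rows if r in ("BBBBBWWWWW", "WWWWWBBBBB")]
black_block = [r for r in rows if "BBBBB" in r]  # 5 black grouped, white anywhere

print(len(rows))         # 252 distinct rows in total
print(len(alternating))  # 2
print(len(both_blocks))  # 2
print(len(black_block))  # 6
```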

Another example is a two-dice system. When you roll them, you can reach a combined result of 7 in six different ways, while for the results 2 and 12 there is only one way each.
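The dice counts can be verified the same way, for instance with a sketch like this:

```python
from itertools import product

# Count the ordered (die1, die2) outcomes for each total of two dice.
ways = {}
for a, b in product(range(1, 7), repeat=2):
    ways[a + b] = ways.get(a + b, 0) + 1

print(ways[7])   # 6: (1,6), (2,5), (3,4), (4,3), (5,2), (6,1)
print(ways[2])   # 1: (1,1)
print(ways[12])  # 1: (6,6)
```

In the statistical language above, a total of 7 is the highest-entropy outcome of the roll: more underlying arrangements produce it.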

In the physical world, imagine a perfect crystal, with the atoms ideally arranged exactly where they should be. There is only one way of doing that, and therefore it has the lowest entropy. Nature does not like low entropy, so in the real world this is hard or impossible to achieve, depending on the scale. There are always some defects in the structure, which introduce some entropy, which in turn lowers the overall free energy of the system. What I am describing here is more or less a thermodynamic potential, called the Gibbs free energy if you want more reading.

Entropy is the inherent randomness/chaos in a system that tends to bring it to a neutral state.

Imagine a garden; perfectly trimmed and weeded with rows of flowers and plants that you find desirable. This state of the garden is considered to have very low entropy as this is not a naturally occurring situation and you have manufactured order. Now imagine you let a month go by without impacting the garden. Odds are some weeds have sprouted throughout the area as well as the grass having grown quite a bit. Perhaps some unkempt bushes stick out over your flowers, crushing some in the process. Entropy has brought the area closer to its neutral state (that of the original landscape, be it prairie, woodland, etc.). Now let a year or two go by, and you may have difficulty seeing any semblance of a garden existing in the first place. People equate entropy with chaos but it is more akin to natural decay, with order disturbing the balance.

This process is the easiest to imagine entropy with, but it goes much further than that. The true neutral state for the universe is… nothingness. Eventually the Sun will burn out, the Earth’s and all the planets’ cores will cool to absolute zero, and all atoms will cease to move. This is the fate of an expanding universe.

Entropy.

Such a simple word, but so complex in what it is.

Okay. Let’s say I have a perfectly nice stack of blocks, stacked vertically, ordered by A, B, C, D, E, and so on. This is a state of low entropy.

Then I smashed the stack.

Now we see blocks scattered all over the floor, randomly. This is a state of higher entropy.

How do we get the stack back in order? We spend work to put it back, or we can reset the universe (or open up the menu and press Load).

Entropy is the same. You can spend work and try to reduce entropy, or simply go back in time.

Entropy, as laymen can understand it, is a representation of the disorderliness of a state of things in a particular space. It naturally goes up (unless you're rewinding time). You have to put in some work to bring it back down.

(And whoever playing jumbo jengas at 2AM in a fucking loud party next to my apartment, you can go fuck yourself. The noise of jenga entropy is too damn loud.)

Entropy is NOT disorder; it's often called that because entropy LEADS to disorder, but they are not the same thing. Entropy is simply a statistic that arises because energy is more likely to spread out than to concentrate into a specific state.

For example, imagine you have 100 cups and a pitcher full of water, and you put drops of that water into the cups at random. What are the chances that *all* of the water ends up in 1 or 2 cups? Very slim: definitely possible, but very unlikely. What is more likely is that the water gets spread out between the cups evenly (relatively speaking, of course; some cups will have more water than others). Each exact assignment of drops to cups is called a microstate, and each broad outcome (all the water in 1 cup, the water spread roughly evenly, and anything in between) is a macrostate. Some macrostates contain far more microstates than others, which is why the water spreading out evenly is far more likely than all of it landing in one cup.
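A quick simulation illustrates the point (the drop and cup counts here are arbitrary):

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

# Drop 10,000 drops into 100 cups uniformly at random.
cups = [0] * 100
for _ in range(10_000):
    cups[random.randrange(100)] += 1

# Every cup ends up near the average of 100 drops; none is empty,
# and none holds anywhere close to all of the water.
print(min(cups), max(cups))
```

However many times you rerun this with different seeds, you will essentially never see one cup grab all the water: the "spread out" macrostate simply has overwhelmingly more microstates.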

Entropy is essentially just a way to measure which state is more likely. If a state is more likely, it is said to have higher entropy. And that's why entropy is said to always increase in a closed system: the system will always evolve toward a state that is more likely.

Now in physics, when people talk about entropy they're usually talking about energy. So instead of the water in the previous analogy, it's energy that gets distributed across the system, and instead of cups, it's atoms, molecules, and other particles and waves that the energy gets distributed among. One thing to remember is that there's nothing in physics that says entropy HAS to increase; it's just that entropy is extremely (and I can't stress the extremely enough) likely to increase.

Edit: One example my professor gave that really resonated with me during the discussion of entropy is that there is absolutely nothing in physics stopping all the air in the room you're in from suddenly moving to one side and suffocating everyone on the other side. The only thing that keeps the air from doing that is probability. It's just incredibly unlikely that the trillions of air molecules, whose velocities are more or less random, would all randomly start moving in the same direction to the other side of the room; it's much more likely that they spread out relatively evenly. That is entropy.
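The arithmetic behind that example is simple: if each molecule independently ends up in either half of the room, the probability that all N of them happen to be in one chosen half is (1/2)^N. For example:

```python
# Chance that ALL of N independent molecules sit in one chosen half
# of the room at the same moment: (1/2) ** N.
for n in (10, 100, 1000):
    print(n, 0.5 ** n)

# Already at n = 1000 the probability is around 1e-301; a real room
# holds on the order of 10**25 molecules, so it effectively never happens.
```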

There are more ways for your room to be messy than for it to be clean. If there was an earthquake, it’s possible that your room would be perfectly put together at the end, but it’s more likely that it’ll be a mess. If it was a little bit messy before the quake, it’s probably more messy after. If there were another quake, it would get messier still, until your room is just so messy and disorderly that tossing things around in another earthquake doesn’t really make the situation any better or worse. Entropy has reached its maximum at that point.

Entropy is a measure of how many equally-messy states there are. There’s only one way for your room to be perfectly clean, and a few ways for it to be almost perfectly clean, but a ton of ways for it to be a complete disaster. Everything in chemistry moves with some element of randomness, like your belongings in an earthquake. It’s not impossible for that random action to tidy up your room, but it is very unlikely.

Imagine you tidy up your room; over time it gets messy. It will never go back to its initial tidy state on its own. It will always get messier. So you have to clean it to bring it back to its tidy state.

The same applies to physical systems. Assume you have a perfectly ordered system; over time, it slowly loses its order. It will never move itself back to the perfectly ordered state. It will always drift toward disorder.

That’s entropy.

At a high level, entropy is a measure of randomness. What it really means is how many different ways can the parts of something be arranged while still looking the same as a whole.

For example, if you have a ball pit but sort all of the balls into clumps of each color (i.e. all the red ones in the top left, all the blue ones in the top right, etc.), there are only so many ways you can rearrange the balls in that pit before a casual observer can tell that someone came by and changed the arrangement.

If you have the balls all mixed together, there are more ways you can shuffle the balls before an observer can tell that the arrangement was changed, meaning it has a higher entropy.

In the field of thermodynamics, we would be talking about the arrangement of atoms and molecules, and what we would be observing would be things like temperature, and density. Even though the molecules in a glass of water are constantly moving around and breaking apart and coming together, from our perspective all of these different arrangements are effectively the same overall system to us since bulk properties like the temperature and density are still the same.

They explain this well in “Tenet”. I would recommend you watch it. Or maybe I misunderstood the question.

So. This is from my viewpoint.

It’s the karmic wheel. What goes around comes around.

There is order and chaos. Good and evil.

Fate and fortune along with misguided and mislead.

I would like to re-ask this question to get an answer that could be explained to a five-year-old. One of the responses mentions that this is likely not possible. If that’s true, so be it.
I was taught the answer is, “The center does not hold.” Everything gets a little more complicated the further in time we go. None of the answers sound anything like this, so I’m assuming what I was taught was wrong.
Sooo, can someone actually ELI5?

Imagine you’re telling me what your room looks like so I can perfectly replicate it. Entropy is roughly the number of words it would take you to describe your room, in all its details, to me. If your room is clean, it takes fewer words to describe; if your room is messy, more words. That’s actually a fairly technical definition of entropy, but hopefully still ELI5.
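One rough way to make this description-length idea concrete is with a compressor: orderly data has a short description, random data does not. A sketch (the "rooms" here are just made-up byte strings):

```python
import random
import zlib

random.seed(0)

# Compressed size as a stand-in for "how many words the description takes".
tidy = b"bed bookshelf desk " * 50                        # orderly, repetitive "room"
messy = bytes(random.randrange(256) for _ in range(950))  # random "room"

print(len(tidy), len(messy))         # both are 950 bytes of raw data
print(len(zlib.compress(tidy)))      # tiny: the repetition compresses away
print(len(zlib.compress(messy)))     # ~950 or more: random bytes barely compress
```

Same amount of "stuff" in both rooms, but the messy one genuinely needs a longer description; that gap is what the description-length view of entropy is measuring.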

Several good examples here from the perspective of orderliness. But is it possible to describe entropy from the perspective of energy states before and after a natural phenomenon?

It depends on who you ask. Ask an engineer who deals with thermodynamics and you'll get a totally different answer than from a physicist. When I took a thermodynamics course as part of my engineering degree, we had one lesson dedicated ONLY to the concept of the term and discussion of it. Afterwards the professor had a rule about not talking about it, and just going with the terms of the equation.

In engineering, entropy means that the potential difference changes. Take a bucket of ice in a room: as the ice melts, it “takes energy” from the room. The entropy of the room decreases, but the entropy of the bucket increases by at least as much. If you consider the room and everything in it as one system, the total entropy has gone up slightly, because heat flowed across a temperature difference.

While a physicist might say that increase in entropy means that structured or ordered system becomes more random.

If you ask an IT person, to them entropy means the average amount of information in a message, i.e. how uncertain you are about its contents before you receive it.
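That IT-person's notion is Shannon entropy: the average information, in bits per symbol, of a message. A minimal sketch:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits (Shannon entropy)."""
    n = len(message)
    return sum((c / n) * math.log2(n / c) for c in Counter(message).values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 -> totally predictable, no surprise
print(shannon_entropy("abababab"))  # 1.0 -> one bit of uncertainty per symbol
print(shannon_entropy("abcdefgh"))  # 3.0 -> eight equally likely symbols
```

The parallel with the physical definitions above is exact: more equally likely possibilities means higher entropy.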

Others have pointed out that it is a measure of the order/disorder in a system. But I think that is not quite getting to the point.

The point is that it is a way to know how much work can be done in a closed system.

First, it is easy to misunderstand “closed system”. This is a concept of an isolated hunk of stuff where nothing comes in, and nothing goes out. Well, there is no such thing other than (arguably) the entire universe.

For example, the Earth is sitting in space, not touching anything, but is not closed because energy comes from the sun as radiation particles. And the Earth radiates lower heat energy, and has particles stripped away.

So this is more of a thought experiment, and in practice, a way to find inputs to a system when those inputs may not be obvious.

So… As a thought experiment: if you make a closed system out of the Earth, the plant life will all die from lack of light, and then everything else will die in a chain reaction. But that is not the interesting bit. Over time, the crust of the Earth would heat up and melt. Why? In Earth’s current state, the core is hottest, due to several processes, and the crust is cool because it is able to radiate heat to space. A “closed-system Earth” can no longer radiate heat away, so the entire planet approaches the same temperature over time, and that temperature is high enough to melt the crust.

This is more interesting than it might seem at first. Most power generation has to do with temperature differences. The simple example is a steam engine: it is the concentrated heat on one side that provides the energy. In our “closed-system Earth”, once the heat has equalized, there is much less opportunity to build such engines. The Earth would be in a higher-entropy state, and less work could be accomplished.

In the real world, we can literally dump water into a deep hole, let the Earth heat the water to steam, and use the steam to run a generator. That works because the Earth is still in a lower-entropy state, with temperature differences left to exploit; more work can be accomplished.

I think Brian Cox has a good explanation of this in one of his shows.

If you build a sand castle in a desert, it is a highly ordered structure (low entropy), but over time it will revert back to just grains of sand (high entropy).

However, in all the time the desert has existed, the sand has never randomly arranged itself into a sand castle. It is not impossible, just very unlikely, because things naturally seem to go from low to high entropy.