What does the concept of entropy mean?

28 Answers

Anonymous 0 Comments

It depends on who you ask. Ask an engineer who deals with thermodynamics and you get a totally different answer than you would from a physicist. When I had a thermodynamics course as part of my engineering degree, we had one lesson dedicated ONLY to the concept of the term and a discussion of it. Afterwards the professor had a rule about not talking about it, and just working with the terms in the equations.

In engineering, entropy is about potential differences (like a temperature difference) evening out. Picture a bucket of ice in a room. As the ice melts it “takes energy” from the room. The entropy of the room decreases, but the entropy of the bucket increases, and by a bit more, because the heat leaves the warm room and enters the colder ice. If you consider the room and everything in it as one isolated system, the total entropy never decreases; here it actually goes up slightly.
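A tiny back-of-the-envelope version of that in Python (the heat amount and temperatures are made-up illustrative numbers, using the textbook relation ΔS = Q/T for heat flowing at roughly constant temperature):

```python
# Illustrative numbers only: a bucket of ice at 0 °C (273 K) melting in a room at 20 °C (293 K).
Q = 1000.0                       # joules of heat flowing from the room into the bucket (made up)
T_room, T_ice = 293.0, 273.0     # kelvin

dS_room   = -Q / T_room          # the room loses entropy:   about -3.41 J/K
dS_bucket = +Q / T_ice           # the bucket gains entropy:  about +3.66 J/K
print(dS_room + dS_bucket)       # about +0.25 J/K -> the total goes up, it never goes down
```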

A physicist, on the other hand, might say that an increase in entropy means a structured or ordered system becomes more random.

If you ask an IT person, entropy means the amount of missing information: how uncertain you are about a message before you actually receive it.
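For that information-theory meaning, here is a minimal Python sketch (the function name and example strings are just illustrative) of Shannon entropy, the average number of bits of “missing information” per character:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average bits of information per character in `message`."""
    counts = Counter(message)
    total = len(message)
    # H = sum over characters of -p * log2(p), where p is that character's frequency
    return sum(-(c / total) * log2(c / total) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 -> perfectly predictable, nothing "missing"
print(shannon_entropy("abababab"))  # 1.0 -> one yes/no question per character
print(shannon_entropy("q7#xT!mZ"))  # 3.0 -> eight distinct characters, three bits each
```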

Anonymous 0 Comments

Several good examples here from the perspective of orderliness. But is it possible to describe entropy from the perspective of energy states before and after a natural phenomenon?

Anonymous 0 Comments

Imagine you’re telling me what your room looked like so I can perfectly replicate it. Entropy is roughly the number of words it’ll take for you to describe your room, in all details, to me. If your room is clean, it takes fewer words to describe. If your room is messy, more words. That’s a very technical definition of entropy that’s nonetheless eli5 hopefully.
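One loose way to see that description-length idea in practice (just an illustrative sketch; compressed size is only a stand-in for “number of words needed”): a room whose contents follow a simple pattern compresses to a far shorter description than one where every item sits somewhere arbitrary.

```python
import random
import zlib

# A "tidy" room: everything follows one simple, repeating pattern.
tidy = ("book on shelf; " * 200).encode()

# A "messy" room: every item sits at its own arbitrary spot.
random.seed(0)
messy = "; ".join(
    f"item{i} at ({random.randint(0, 99)}, {random.randint(0, 99)})"
    for i in range(200)
).encode()

print(len(tidy), "->", len(zlib.compress(tidy)))    # compresses to a very short description
print(len(messy), "->", len(zlib.compress(messy)))  # needs a much longer description
```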

Anonymous 0 Comments

I would like to re-ask this question to get an answer that could be explained to a five-year-old. One of the responses mentions that this is likely not possible. If that’s true, so be it.
I was taught the answer is, “The center does not hold.” Everything gets a little more complicated the further in time we go. None of the answers sound anything like this, so I’m assuming what I was taught was wrong.
Sooo, can someone actually ELI5?

Anonymous 0 Comments

So. This is from my viewpoint.

It’s the karmic wheel. What goes around comes around.

There is order and chaos. Good and evil.

Fate and fortune, along with misguided and misled.

Anonymous 0 Comments

They explain this well in “Tenet”. I would recommend watching it. Or maybe I misunderstood the question.

Anonymous 0 Comments

At a high level, entropy is a measure of randomness. What it really means is how many different ways can the parts of something be arranged while still looking the same as a whole.

For example, if you have a ball pit but sort all of the balls into clumps of each color, i.e. all the red ones in the top left, all the blue ones in the top right, etc., there are only so many ways you can rearrange the balls in that pit before a casual observer can tell that someone came by and changed the arrangement in the ball pit.

If you have the balls all mixed together, there are more ways you can shuffle the balls before an observer can tell that the arrangement was changed, meaning it has a higher entropy.

In the field of thermodynamics, we would be talking about the arrangement of atoms and molecules, and what we would be observing would be things like temperature, and density. Even though the molecules in a glass of water are constantly moving around and breaking apart and coming together, from our perspective all of these different arrangements are effectively the same overall system to us since bulk properties like the temperature and density are still the same.
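To put rough numbers on the ball-pit picture (an illustrative sketch; the choice of 10 red and 10 blue balls is arbitrary), you can count how many distinct arrangements correspond to each thing the casual observer might see:

```python
from math import comb

# 10 red and 10 blue balls in 20 fixed spots, 10 "top" and 10 "bottom".
# The macrostate is what a casual observer sees: how many red balls are on top.
# W(k) = number of distinct arrangements with exactly k red balls in the top half.
def microstates(k: int, n: int = 10) -> int:
    return comb(n, k) * comb(n, n - k)

print(microstates(10))  # 1      -> perfectly sorted: exactly one way to look like this
print(microstates(7))   # 14400  -> partly mixed: already many more ways
print(microstates(5))   # 63504  -> well mixed: the most ways, i.e. the highest entropy
```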

Anonymous 0 Comments

Imagine you tidy up your room; over time it gets messy. It will never go back to its initial tidy state on its own. It will always get more messy. So you have to clean it to bring it back to its tidy state.

The same applies to physical systems. Assume you have a perfectly ordered system; over time, it slowly loses its order. It will never move itself back to the perfectly ordered state. It will always drift toward no order.

That’s entropy.

Anonymous 0 Comments

There are more ways for your room to be messy than for it to be clean. If there was an earthquake, it’s possible that your room would be perfectly put together at the end, but it’s more likely that it’ll be a mess. If it was a little bit messy before the quake, it’s probably more messy after. If there were another quake, it would get messier still, until your room is just so messy and disorderly that tossing things around in another earthquake doesn’t really make the situation any better or worse. Entropy has reached its maximum at that point.

Entropy is a measure of how many equally-messy states there are. There’s only one way for your room to be perfectly clean, and a few ways for it to be almost perfectly clean, but a ton of ways for it to be a complete disaster. Everything in chemistry moves with some element of randomness, like your belongings in an earthquake. It’s not impossible for that random action to tidy up your room, but it is very unlikely.

Anonymous 0 Comments

Entropy is NOT disorder. It’s often called that because entropy LEADS to disorder, but they are not the same thing. Entropy is simply a statistic that arises because energy is more likely to spread out than to concentrate into one specific state.

For example, imagine you have 100 cups and a pitcher full of water. Now imagine you put drops of that water into the cups at random. What are the chances that *all* of the water ends up in 1 or 2 cups? Very slim. Definitely possible, but very unlikely; what is far more likely is that the water gets spread out between the cups fairly evenly (relatively speaking of course, some cups will have more water than others). Each exact assignment of individual drops to cups is called a microstate, while the coarse outcomes you can actually tell apart, for example all the water in 1 cup, the water spread roughly evenly between the cups, or anything in between, are called macrostates. Every single microstate is equally likely, but some macrostates (like “spread out evenly”) can be produced by vastly more microstates than others (like “all in one cup”), so those are the ones you actually see.

Entropy is essentially just a way of measuring how many microstates a macrostate has, in other words how likely it is. If a state is more likely, it’s said to have high entropy. And that’s why entropy is said to always increase in a closed system: the system will always evolve toward states that are more likely.
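A quick simulation of the cup analogy (illustrative numbers only) shows the same thing: random drops almost always end up in the high-entropy “spread out” macrostate.

```python
import random
from collections import Counter

random.seed(1)
CUPS, DROPS = 100, 10_000

# Drop each bit of water into a cup chosen uniformly at random.
counts = Counter(random.randrange(CUPS) for _ in range(DROPS))

# Every cup ends up near the average of 100 drops -- the "spread out" macrostate.
print(min(counts.values()), max(counts.values()))

# Chance that every single drop lands in one particular cup: (1/100)**10000.
print((1 / CUPS) ** DROPS)  # prints 0.0 -- the true value, 10**-20000, underflows a float
```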

Now in physics, usually when people talk about entropy they’re talking about energy. So instead of the water used in the previous analogy, it’s energy that gets distributed across the system, and instead of cups, it’s atoms, molecules, and other particles and waves that the energy gets distributed among. One thing to remember is that there’s nothing in physics that says entropy HAS to increase; it’s just that entropy is extremely (and I can’t stress the extremely enough) likely to increase.

Edit: One example my professor gave that really resonated with me during the discussion of entropy is that there is absolutely nothing in physics stopping all the air in the room you’re in from suddenly moving to one side and suffocating everyone on the other side. The only thing that keeps the air from doing that is probability. It’s just incredibly unlikely that all the trillions of air molecules, whose velocities are more or less random, would all randomly start moving in the same direction to one side of the room; it’s much more likely that they stay spread out relatively evenly in the room. That is entropy.
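To get a feel for just how unlikely, here is a back-of-the-envelope sketch (the molecule counts are rough, purely for illustration): each molecule is about equally likely to be in either half of the room, so the chance that all N of them are on one side at the same instant is about (1/2)^N.

```python
from math import log10

# Chance that all N air molecules sit in the left half of the room at the same instant:
# (1/2)**N, which is 1 in 10**(N * log10(2)).
for n in (10, 100, 1_000_000):
    print(f"{n:>9} molecules: about 1 chance in 10**{n * log10(2):,.0f}")

# A real room holds something like 10**25 molecules, so that exponent itself
# would have about 25 digits -- "never happens" is a fair summary.
```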