What does the concept of entropy mean?

28 Answers

Anonymous 0 Comments

The most intuitive way to think of entropy is to consider it as a measure of how disordered a system is.

For example, consider a fresh pack of cards. It’s in one order and one order only, and it’s very easy to tell when the deck is no longer in that order. The fresh pack has very low entropy, because there’s only the one arrangement it can be in.

Now, if you shuffle the deck, so that the cards are completely randomized, you’ve raised the entropy of the deck. You can rearrange the individual cards very freely without damaging your ability to say “Yes, that’s a shuffled pack of cards.”
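This intuition can be made numeric in information-theory terms: a fully shuffled deck could be in any of 52! equally likely orders, so describing its exact order takes log2(52!) bits, while the fresh pack takes zero. A quick sketch:

```python
import math

# A fresh deck has exactly one arrangement; a fully shuffled deck could be
# any of 52! arrangements, each equally likely.
arrangements = math.factorial(52)

# Shannon entropy in bits: log2 of the number of equally likely arrangements.
entropy_bits = math.log2(arrangements)

print(f"52! = {arrangements}")
print(f"Entropy of a shuffled deck: {entropy_bits:.1f} bits")  # about 225.6 bits
```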

Another way to think of entropy is the ability to pull useful work out of a system. For example, you need a temperature difference to do any work with a heat engine; if there’s no gradient, nothing’s going to want to move from point A to point B. You need *low entropy*, a condition of order and being able to say “This is different than that,” in order to perform work.
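There is a standard formula behind the heat-engine point: an ideal (Carnot) engine's maximum efficiency depends only on the temperature gradient, and it drops to zero when there is no gradient. A minimal sketch (the function name is mine):

```python
# Ideal (Carnot) heat-engine efficiency depends only on the temperatures:
# efficiency = 1 - T_cold / T_hot, with temperatures in kelvin.
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    return 1.0 - t_cold_k / t_hot_k

# Steam at 573 K dumping heat into 293 K air: a useful gradient.
print(carnot_efficiency(573.0, 293.0))  # ~0.489

# No gradient, no work: equal temperatures give zero efficiency.
print(carnot_efficiency(293.0, 293.0))  # 0.0
```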

Anonymous 0 Comments

All information in the universe becomes less orderly over time: carbon reacts, atoms decay, etc. In the early universe it was all the same type of matter, and now… complex. Very difficult, if not impossible, to put it back together the way it was.

Anonymous 0 Comments

A kettle full of boiling water and a glass of ice water both will become lukewarm with enough time.

Anonymous 0 Comments

It basically describes the observed tendency of physical systems to dissipate energy over time through various random effects, which empirically rules out one type of perpetual motion machine.

To give an easier-to-understand analogy, think of a card pyramid. That is a low-entropy system, because the mechanical stress each card experiences (an analogue for energy) is different. Now let's say that due to vibrations or drafts (an analogue for those various random effects) the pyramid crumbles. That is a high-entropy system, because now each card experiences more or less the same level of mechanical stress.

The reason entropy is sometimes referred to as randomness is that a same-energy system is a lot harder to predict, because each element in that system can move and jitter on its own. To go back to the analogy: there aren't many ways to build a card pyramid, but the crumble pattern is always unique.

Anonymous 0 Comments

You can set off a stick of dynamite. But you can’t put it back together again. The amount of disorder in the universe always increases.

Well, technically you could take all of the gasses etc produced in the explosion, refine them, process them, and after a lot of work end up with a stick of dynamite again. But by introducing a lot of order to that random swirl of gasses, you will introduce even more disorder in the rest of the universe – e.g. you’re burning stuff to generate electricity to power the machines, you likely have used pure chemicals as part of your reactions and now you have a load of mixed chemical waste, etc. The amount of disorder in the universe always increases.

The “disorder” in this example is technically called “entropy”.

Anonymous 0 Comments

One way to think about entropy is from a statistical point of view. This basically says that if there are more ways of arranging things which result in the same outcome, the entropy is higher. So for example you have 10 balls, 5 identical black and 5 identical white, and you arrange them in a row. There are exactly 2 ways to arrange them in an alternating pattern, and likewise only 2 ways to have each colour grouped in a single block. But let's say I only want the 5 black balls grouped, with the white balls wherever: the black block can start in any of 6 positions, so there are 6 ways to arrange that. Therefore the state of "5 black balls touching" has higher entropy than "balls alternating". And so on.
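These counts are easy to verify by brute-force enumeration; note that the black-block-anywhere count comes out to 6, one per starting position of the block. A quick sketch:

```python
from itertools import permutations

# Enumerate every distinct row of 5 black (B) and 5 white (W) balls.
rows = set(permutations("BBBBBWWWWW"))
print(len(rows))  # C(10,5) = 252 distinct arrangements

def alternating(row):
    """True if no two neighbouring balls share a colour."""
    return all(a != b for a, b in zip(row, row[1:]))

def black_grouped(row):
    """True if the 5 black balls form one contiguous block."""
    return "BBBBB" in "".join(row)

print(sum(alternating(r) for r in rows))    # 2
print(sum(black_grouped(r) for r in rows))  # 6
```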

Another example is a two-dice system. When you roll them, you can get a combined result of 7 in six different ways, while for results of 2 and 12 there is only one way each.
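Counting the 36 equally likely rolls confirms this distribution of sums:

```python
from itertools import product

# Tally how many of the 36 equally likely two-dice rolls give each sum.
ways = {}
for a, b in product(range(1, 7), repeat=2):
    ways[a + b] = ways.get(a + b, 0) + 1

print(ways[7])            # 6 ways: the highest-entropy total
print(ways[2], ways[12])  # 1 way each: the lowest-entropy totals
```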

In the physical world, imagine a perfect crystal, with the atoms ideally arranged exactly where they should be. There is only one way of doing that, therefore the lowest entropy. Nature does not like low entropy, so in the real world this is hard or impossible to achieve, depending on the scale. Therefore there are always some defects in the structure, which introduce some entropy, which in turn lowers the overall free energy of the system. What I am trying to describe here is more or less a thermodynamic potential, called Gibbs free energy, if you want more reading.

Anonymous 0 Comments

Entropy is the inherent randomness/chaos in a system that tends to bring it to a neutral state.

Imagine a garden: perfectly trimmed and weeded, with rows of flowers and plants that you find desirable. This state of the garden has very low entropy, as it is not a naturally occurring situation and you have manufactured order. Now imagine you let a month go by without touching the garden. Odds are some weeds have sprouted throughout the area and the grass has grown quite a bit. Perhaps some unkempt bushes stick out over your flowers, crushing some in the process. Entropy has brought the area closer to its neutral state (that of the original landscape, be it prairie, woodland, etc.). Now let a year or two go by, and you may have difficulty seeing any semblance of a garden at all. People equate entropy with chaos, but it is more akin to natural decay, with order disturbing the balance.

This process is the easiest way to picture entropy, but it goes much further than that. The true neutral state for the universe is… nothingness. Eventually the Sun will burn out, the Earth's and all the planets' cores will cool toward absolute zero, and all atoms will all but cease to move. This is the fate of an expanding universe.

Anonymous 0 Comments

Entropy.

Such a simple word, but so complex in what it is.

Okay. Let’s say I have a perfectly nice stack of blocks, stacked vertically, ordered by A, B, C, D, E, and so on. This is a state of low entropy.

Then I smashed the stack.

Now we see blocks scattered all over the floor, randomly. This is a state of higher entropy.

How do we get the stack back in order? We spend work to put it back, or we can reset the universe (or open up the menu and press Load).

Entropy is the same. You can spend work and try to reduce entropy, or simply go back in time.

Entropy, as a layman can understand it, is a representation of the disorderliness of a state of things in a particular space. It naturally goes up (unless you're rewinding time). You have to put some work in to bring it back down.

(And whoever playing jumbo jengas at 2AM in a fucking loud party next to my apartment, you can go fuck yourself. The noise of jenga entropy is too damn loud.)

Anonymous 0 Comments

Entropy is NOT disorder, it’s often called that because entropy LEADS to disorder but they are not the same thing. Entropy is simply a statistic that arises because energy is more likely to spread out than to concentrate into a specific state.

For example, imagine you have 100 cups and a pitcher full of water, and you put drops of that water into the cups at random. What are the chances that *all* of the water ends up in 1 or 2 cups? Very slim: definitely possible, but very unlikely. What is more likely is that the water gets spread out between the cups evenly (relatively speaking of course, some cups will have more water than others). Each specific assignment of individual drops to cups is called a microstate, and each overall pattern (all the water in one cup, the water spread perfectly evenly, and anything in between) is a macrostate. Macrostates that can be produced by more microstates are more likely to occur; the water spread out evenly is more likely than all of it in one cup.
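A tiny simulation (the cup and drop counts are arbitrary choices of mine) shows how reliably random dropping lands near the even spread rather than piling into one cup:

```python
import random

random.seed(0)  # fixed seed so the run is repeatable
CUPS, DROPS = 100, 1000

# Drop each of 1000 drops into one of 100 cups uniformly at random.
counts = [0] * CUPS
for _ in range(DROPS):
    counts[random.randrange(CUPS)] += 1

# The "all water in one cup" macrostate would need max(counts) == DROPS;
# random dropping instead stays near the even spread of 10 drops per cup.
print(max(counts), min(counts))
```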

Entropy is essentially just a way to measure which state is more likely. If a state is more likely its said to have high entropy. And that’s why entropy is said to always increase in a closed system, because the system will always evolve to a state that’s more likely.

Now, in physics, when people talk about entropy they're usually talking about energy. So instead of the water in the previous analogy, it's energy that gets distributed across the system, and instead of cups, it's atoms and molecules and other particles and waves that the energy gets distributed among. One thing to remember is that there's nothing in physics that says entropy HAS to increase; it's just that entropy is extremely (and I can't stress the extremely enough) likely to increase.

Edit: One example my professor gave that really resonated with me during the talk of entropy is that there is absolutely nothing in physics stopping all the air in the room you're in from suddenly moving to one side and suffocating everyone on the other side. The only thing that keeps the air from doing that is probability. It's just incredibly unlikely that all the trillions of air molecules, whose velocities are more or less random, would all randomly start to move in the same direction to the other side of the room; it's much more likely that they stay spread out relatively evenly in the room. That is entropy.
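The professor's point can be put in numbers. Under the simplifying assumption that each molecule independently sits in either half of the room with equal probability, the chance that all N molecules are on one particular side is (1/2)^N. A rough sketch:

```python
import math

# Chance that all N independent molecules sit in one particular half
# of the room, each side being equally likely per molecule.
def prob_all_one_side(n_molecules: int) -> float:
    return 0.5 ** n_molecules

print(prob_all_one_side(10))   # ~0.001: plausible for just 10 molecules
print(prob_all_one_side(100))  # ~8e-31: effectively never

# For a real room (~10**25 molecules) the probability underflows any float;
# even its base-10 exponent is astronomically negative.
log10_p = 1e25 * math.log10(0.5)
print(log10_p)  # about -3.0e24, i.e. p ~ 10**(-3.0e24)
```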

Anonymous 0 Comments

There are more ways for your room to be messy than for it to be clean. If there was an earthquake, it’s possible that your room would be perfectly put together at the end, but it’s more likely that it’ll be a mess. If it was a little bit messy before the quake, it’s probably more messy after. If there were another quake, it would get messier still, until your room is just so messy and disorderly that tossing things around in another earthquake doesn’t really make the situation any better or worse. Entropy has reached its maximum at that point.

Entropy is a measure of how many equally-messy states there are. There’s only one way for your room to be perfectly clean, and a few ways for it to be almost perfectly clean, but a ton of ways for it to be a complete disaster. Everything in chemistry moves with some element of randomness, like your belongings in an earthquake. It’s not impossible for that random action to tidy up your room, but it is very unlikely.