What does the concept of entropy mean?

28 Answers

Anonymous 0 Comments

Entropy.

Such a simple word, yet so complex in what it describes.

Okay. Let’s say I have a perfectly nice stack of blocks, stacked vertically, ordered by A, B, C, D, E, and so on. This is a state of low entropy.

Then I smash the stack.

Now we see blocks scattered all over the floor, randomly. This is a state of higher entropy.

How do we get the stack back in order? We spend work to put it back, or we can reset the universe (or open up the menu and press Load).

Entropy works the same way. You can spend work and try to reduce it, or simply go back in time.

Entropy, as a layman can understand it, is a measure of the disorderliness of a state of things in a particular space. It naturally goes up (unless you’re rewinding time). You have to put some work in to bring it back down.

(And whoever is playing jumbo Jenga at 2AM at a fucking loud party next to my apartment, you can go fuck yourself. The noise of Jenga entropy is too damn loud.)

Anonymous 0 Comments

Entropy is the inherent randomness/chaos in a system that tends to bring it to a neutral state.

Imagine a garden; perfectly trimmed and weeded with rows of flowers and plants that you find desirable. This state of the garden is considered to have very low entropy as this is not a naturally occurring situation and you have manufactured order. Now imagine you let a month go by without impacting the garden. Odds are some weeds have sprouted throughout the area as well as the grass having grown quite a bit. Perhaps some unkempt bushes stick out over your flowers, crushing some in the process. Entropy has brought the area closer to its neutral state (that of the original landscape, be it prairie, woodland, etc.). Now let a year or two go by, and you may have difficulty seeing any semblance of a garden existing in the first place. People equate entropy with chaos but it is more akin to natural decay, with order disturbing the balance.

This process is the easiest way to imagine entropy, but it goes much further than that. The true neutral state for the universe is… nothingness. Eventually the Sun will burn out, the Earth’s and all the planets’ cores will cool toward absolute zero, and all atoms will all but cease to move. This is the fate of an expanding universe.

Anonymous 0 Comments

One way to think about entropy is from a statistical point of view. This basically says that if there are more ways of arranging things which result in the same outcome, the entropy of that outcome is higher. So for example you have 10 balls, 5 identical black and 5 identical white, and you arrange them in a row. There are exactly 2 ways to arrange them in an alternating pattern, and likewise only 2 ways to have all five of each color touching. But let’s say I only want the 5 black balls grouped together, and the white balls wherever: the block of black balls can start at any of 6 positions, so there are exactly 6 ways to arrange that. Therefore the state of “5 black balls touching” has higher entropy than “balls alternating”. And so on.
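This counting is small enough to check directly in Python (a quick sketch; the macrostate definitions are just the ones from the example above):

```python
from itertools import combinations
from math import log

# Enumerate every distinct row of 5 black (B) and 5 white (W) balls,
# then count how many rows realize each "macrostate".
rows = []
for black_positions in combinations(range(10), 5):
    row = ["W"] * 10
    for i in black_positions:
        row[i] = "B"
    rows.append("".join(row))

# Alternating: no two neighbors share a color.
alternating = sum(1 for r in rows if "BB" not in r and "WW" not in r)
# All 5 black balls in one contiguous block, white balls wherever.
black_grouped = sum(1 for r in rows if "BBBBB" in r)

print(len(rows))       # 252 distinct arrangements in total
print(alternating)     # 2
print(black_grouped)   # 6
# Boltzmann's S = k*ln(W): more arrangements means higher entropy.
print(log(black_grouped) > log(alternating))  # True
```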

Another example is a two-dice system. When you roll two dice, a combined result of 7 can be achieved in many ways, while for results of 2 and 12 there is only one way each.
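The same counting in code (a sketch, treating the two dice as distinguishable):

```python
from collections import Counter

# Count how many ordered (die1, die2) outcomes produce each total.
ways = Counter(a + b for a in range(1, 7) for b in range(1, 7))

print(ways[7])   # 6: (1,6), (2,5), (3,4), (4,3), (5,2), (6,1)
print(ways[2])   # 1: (1,1)
print(ways[12])  # 1: (6,6)
```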

In the physical world, imagine a perfect crystal, with every atom ideally arranged exactly where it should be. There is only one way of doing that, therefore it has the lowest entropy. Nature does not favor low entropy, so in the real world this is hard or impossible to achieve, depending on the scale. Therefore, there are always some defects in that structure, which introduce some entropy, which in turn lowers the overall free energy of the system. What I am trying to describe here is more or less a thermodynamic potential called the Gibbs free energy, if you want more reading.
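For the curious, here is a rough sketch of that trade-off. All numbers (a 0.3 eV defect formation energy, 1000 lattice sites, 1000 K) are illustrative assumptions, not real material data; the point is only that G = H − TS is minimized with some defects present, not zero:

```python
import math

k = 8.617e-5  # Boltzmann constant in eV/K
Ef = 0.3      # energy cost per defect in eV (assumed, illustrative)
N = 1000      # lattice sites
T = 1000.0    # temperature in kelvin

def gibbs(n):
    # n defects cost n*Ef in energy but add configurational entropy
    # S = k * ln(number of ways to place n defects on N sites).
    S = k * math.log(math.comb(N, n))
    return n * Ef - T * S

# The minimum of G lands at some n > 0: a perfect crystal is not
# the equilibrium state at finite temperature.
best_n = min(range(0, 50), key=gibbs)
print(best_n > 0)                    # True
print(gibbs(best_n) < gibbs(0))      # True
```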

Anonymous 0 Comments

You can set off a stick of dynamite. But you can’t put it back together again. The amount of disorder in the universe always increases.

Well, technically you could take all of the gases etc. produced in the explosion, refine them, process them, and after a lot of work end up with a stick of dynamite again. But by introducing a lot of order to that random swirl of gases, you will introduce even more disorder in the rest of the universe – e.g. you’re burning stuff to generate electricity to power the machines, you’ve likely used pure chemicals as part of your reactions and now have a load of mixed chemical waste, etc. The amount of disorder in the universe always increases.

The “disorder” in this example is technically called “entropy”.

Anonymous 0 Comments

It basically describes the observed tendency of physical systems to dissipate energy over time through various random effects, thereby empirically ruling out one type of perpetual motion machine.

To give an easier-to-understand analogy, think of a card pyramid. That is a low-entropy system, because the mechanical stress each card experiences (an analog for energy) is different. Now say that, due to vibrations or drafts (an analog for those various random effects), the pyramid crumbles. That is a high-entropy system, because now more or less every card experiences the same level of mechanical stress.

The reason entropy is sometimes referred to as randomness is that a same-energy system is a lot harder to predict, because each element in that system can move and jitter on its own. To go back to the analogy: there aren’t many ways to build a card pyramid, but the crumble pattern is always unique.

Anonymous 0 Comments

A kettle full of boiling water and a glass of ice water both will become lukewarm with enough time.
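You can put rough numbers on that with Newton’s law of cooling/warming; the rate constant and temperatures below are made-up illustrative values:

```python
import math

T_ROOM = 20.0  # ambient temperature, °C (assumed)
K = 0.1        # heat-exchange rate constant, per minute (assumed)

def temp(t0, minutes):
    # Newton's law: T(t) = T_room + (T0 - T_room) * exp(-k*t)
    return T_ROOM + (t0 - T_ROOM) * math.exp(-K * minutes)

kettle = temp(100.0, 60)  # boiling kettle, an hour later
ice = temp(0.0, 60)       # ice water, an hour later

print(round(kettle, 2))   # 20.2  -- essentially room temperature
print(round(ice, 2))      # 19.95 -- likewise
```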

Anonymous 0 Comments

All information in the universe becomes less orderly over time: carbon reacts, atoms decay, etc. The early universe was all the same type of matter, and now… complex. It would be very difficult, if not impossible, to put it back together the way it was.

Anonymous 0 Comments

The most intuitive way to think of entropy is to consider it as a measure of how disordered a system is.

For example, consider a fresh pack of cards. It’s in one order and one order only, and it’s very easy to tell when the deck is no longer in that order. The fresh pack has very low entropy, because there’s only the one arrangement it can be in.

Now, if you shuffle the deck, so that the cards are completely randomized, you’ve raised the entropy of the deck. You can rearrange the individual cards very freely without damaging your ability to say “Yes, that’s a shuffled pack of cards.”
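In Boltzmann’s formulation, entropy is proportional to the log of the number of arrangements that count as “the same” state. A quick sketch (treating a fresh pack as exactly one arrangement and a shuffled pack as any of the 52! orderings, with the constant k set to 1):

```python
import math

fresh_states = 1                      # a fresh pack is one specific ordering
shuffled_states = math.factorial(52)  # any ordering counts as "shuffled"

# S = k * ln(W), with k = 1 for simplicity
S_fresh = math.log(fresh_states)
S_shuffled = math.log(shuffled_states)

print(S_fresh)               # 0.0 -- lowest possible entropy
print(round(S_shuffled, 1))  # 156.4
```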

Another way to think of entropy is the ability to pull useful work out of a system. For example, you need a temperature difference to do any work with a heat engine; if there’s no gradient, nothing’s going to want to move from point A to point B. You need *low entropy*, a condition of order and being able to say “This is different than that,” in order to perform work.
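That heat-engine point can be made concrete with the Carnot efficiency limit, 1 − T_cold/T_hot (temperatures in kelvin; the numbers below are illustrative):

```python
def carnot_efficiency(t_hot, t_cold):
    # Maximum fraction of heat a reversible engine can turn into work.
    return 1.0 - t_cold / t_hot

# With a gradient, some work can be extracted:
print(round(carnot_efficiency(500.0, 300.0), 3))  # 0.4
# With no gradient, no work at all:
print(carnot_efficiency(300.0, 300.0))            # 0.0
```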