What does the concept of entropy mean?


Anonymous 0 Comments

You have three toy blocks. You stack them into a tower. You knock them over.

It took less energy to knock them over than to stack them.

Anonymous 0 Comments

A machine does a task. The task requires a transfer of a certain amount of energy to feed the machine. Depending on how well the machine is designed and maintained, it might do the task well or poorly; that is, do a lot of work or a little for that amount of energy. Entropy tells you how much more work you could have gotten out of your machine if you were as efficient as possible: it is that amount of wasted energy divided by the temperature of the space around the machine. Now replace “machine” with “any volume of space” and “task” with “any interaction at all between the space and its surroundings.”

Anonymous 0 Comments

Time moves in one direction, and that direction is dictated by the general increase in entropy. Entropy drives the universe to a state where any given space is occupied by an equal mixture of matter and energy.
As other answers have stated, you cannot unexplode a bomb. The energy has been released, and the effects have occurred.
Another example is cream poured into coffee. It pours, blooms, and fills the cup. You cannot extract the cream from the coffee once it has been poured without a significant expenditure of energy.
Expending energy helps explain why locally order can increase, but generally everything proceeds to a homogeneous (a perfect mixture) state of disordered matter.

Anonymous 0 Comments

Please stop it with the chaos and disorder garbage.

Entropy is a measure of the energy gradient in a system. This can be any kind of energy: thermal, potential, kinetic, chemical, etc.

Low entropy means a large gradient, so you have a system where parts have relatively high and low energy, and high entropy in a system means everything has about the same energy.

To extract energy from a system you need an energy gradient, so in a low entropy system you can extract a lot of energy, while in a high entropy system you cannot extract energy at all.
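The gradient idea above can be made concrete with the thermodynamic definition of an entropy change, dS = Q/T. The sketch below uses made-up illustrative temperatures and heat amounts, not anything from the thread:

```python
def entropy_change(q_joules, t_hot_k, t_cold_k):
    """Total entropy change when heat q flows from a hot reservoir
    to a cold one: the hot side loses q/T_hot, the cold side gains
    q/T_cold, and the sum is positive whenever a gradient exists."""
    return -q_joules / t_hot_k + q_joules / t_cold_k

# Large gradient (low entropy): 1000 J flowing from 600 K to 300 K
print(entropy_change(1000, 600, 300))   # ≈ +1.67 J/K — entropy rises
# No gradient (high entropy): both sides at 300 K
print(entropy_change(1000, 300, 300))   # 0.0 — nothing left to flow
```

Once the two sides reach the same temperature, the gradient is gone and no further work can be extracted, which is exactly the "high entropy" case described above.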

Anonymous 0 Comments

If we’re going by thermodynamics (heat and energy science):

Entropy is heat loss. All processes lose a bit of heat due to imperfections. Everything loses heat in an environment that’s colder than it (or if it’s a reaction that releases heat), and outer space (most of the universe) has next to no heat. Thus, everything runs out of heat/energy eventually.

Anonymous 0 Comments

The way I always thought of entropy is that everything tends toward chaos if you let it. Probably not the best mental model, but I feel entropy just means that if something, like energy, can find its way outward into spaces without energy, it will. If something already has heat, the heat will want to get to the cold areas, as if it is trying to fill as much space as it can. Not sure if that's at all correct.

Anonymous 0 Comments

Entropy is a statistical property that relates the microstates of a particular system (the exact state of every element in the system) to the system’s macrostate (the overall, high level state of the system).

A good example would be to consider a set of 5 coins. The macrostate of 5 heads has only one microstate, 5 heads:
H,H,H,H,H

The macrostate of 4 heads has 5 possible microstates:

T,H,H,H,H

H,T,H,H,H

H,H,T,H,H

H,H,H,T,H

H,H,H,H,T

The macrostate of 3 heads has 10 possible microstates:

T,T,H,H,H

T,H,T,H,H

T,H,H,T,H

T,H,H,H,T

H,T,T,H,H

H,T,H,T,H

H,T,H,H,T

H,H,T,T,H

H,H,T,H,T

H,H,H,T,T

The multiplicity of a particular macrostate is the number of microstates corresponding to the same macrostate. In the above examples, 5 heads has a multiplicity of 1, 4 heads has a multiplicity of 5, 3 heads has a multiplicity of 10, 2 heads will also have a multiplicity of 10, 1 head has a multiplicity of 5, and 0 heads will have a multiplicity of 1. The entropy of a system is (Boltzmann’s constant multiplied by) the log of the multiplicity.

When you flip a number of coins, while you can’t make any prediction on the particular sequence you will get, you can be reasonably confident that you’re likely to get an even split of heads and tails, a high entropy state, compared to all heads or all tails, a low entropy state.
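The coin counting above can be checked directly in code. The multiplicity of "k heads out of n coins" is the binomial coefficient C(n, k), and Boltzmann's formula turns that into an entropy. A minimal sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(n_coins, n_heads):
    """Number of microstates (coin sequences) for a given macrostate."""
    return math.comb(n_coins, n_heads)

def entropy(n_coins, n_heads):
    """Boltzmann entropy S = k_B * ln(multiplicity)."""
    return K_B * math.log(multiplicity(n_coins, n_heads))

for k in range(6):
    print(k, "heads:", multiplicity(5, k))
# multiplicities come out 1, 5, 10, 10, 5, 1 — the even splits
# (2 or 3 heads) have the most microstates, hence the highest entropy
```

Note that the all-heads macrostate has multiplicity 1, so its entropy is k_B·ln(1) = 0: the lowest possible, matching the intuition that an even split is overwhelmingly more likely than all heads.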

Anonymous 0 Comments

Let’s start with an easy-to-conceptualize model and work out from there.

Picture one of those plastic link clicky chain toys, like [this](https://i5.walmartimages.com/asr/b1066da1-a213-4153-9d9e-1e74aed4ac43.d6f545ff4d1793551235411f6a4cbe4f.jpeg). There are 23 joints in the chain, and each can be clicked into one of five positions. There’s exactly one configuration that makes a completely straight line: all 23 joints are clicked into the “straight” position. By contrast, there are 46 different configurations where exactly one joint is bent into a 90-degree corner in either direction, 2 for each joint. Then, for each of those 46 configurations, there are 44 ways to bend a second joint (22 remaining joints, 2 directions each), which after dividing by two for double counting gives over 1000 configurations where exactly two joints are bent. Once you start getting into 3, 4, 10 links bent, the number quickly grows into the millions, billions, and beyond.

So if someone were to walk up and click each joint into a random position, there’s only one possible outcome where the result is completely straight, while there are huge numbers of outcomes where the chain is bent somehow. This means that functionally, you’re basically never going to end up with a completely straight chain from just random bends. Entropy represents how likely a certain state is within the available space of all states. The rule goes: “Within a given system, entropy always increases unless it’s acted on by an outside force.” Because of these likelihoods, your chain is going to end up more bent over time until it reaches a point of maximum entropy. You can come in and manually straighten every link in the chain, decreasing entropy, but that requires you as an outside force to act on the system.
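The counting in the chain example follows a simple pattern: choose which k of the 23 joints are bent, times 2 bend directions for each (counting only the two 90-degree positions, as the example does). A quick sketch:

```python
import math

JOINTS = 23  # joints in the chain toy from the example

def bent_configs(k):
    """Number of distinct configurations with exactly k joints bent
    90 degrees: choose which k joints, times 2 directions each."""
    return math.comb(JOINTS, k) * 2**k

print(bent_configs(0))   # 1 — the single completely straight chain
print(bent_configs(1))   # 46 — one bend, either direction
print(bent_configs(2))   # 1012 — already over a thousand
print(bent_configs(10))  # well over a billion
```

With ten joints bent there are more than a billion configurations against the single straight one, which is why random clicking essentially never produces a straight chain.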

Let’s try a more complex example. Let’s say you have a glass jar. You pour in colored sand, from 8 different colors, one layer at a time, making a rainbow. Each layer contains millions of grains of sand. Your jar is going to start out neatly layered, but as it gets jostled, shaken, moved around, the grains will move around and blur the neat lines, until eventually you have a jar of indeterminate rainbow mush. In theory, there’s a tiny possibility that you could shake this mixed jar and the layers would separate out – but the possibility is so small that the chances of this happening randomly are effectively zero. You could use a microscope and an extreme precision tweezers to separate out the grains of sand again by hand, but it’s easy to see how the energy required to do this is extraordinarily high compared to the effort it took to jumble them together. The sorted, separated state has a much lower entropy than the jumbled state.

The fundamental particles that make up everything are an infinitely more complex version of grains of sand and chains of links. Without any outside interference, solid structures will break down, energy leaks out into the void of space, and you are eventually left with an indiscriminate mass of inert, diffuse particles, the highest entropy state.

So how is it that anything actually works? The key is in forces outside the system. The sun is a powerful force that puts out absolutely insane levels of energy out into the solar system. Our planet absorbs this energy through various means, and uses it to decrease entropy on a much smaller scale. The solar energy of our sun gets converted into energy in plants, which is used to build cellular structures. These plants are consumed by animals that break down those cells and use the energy to build their own cellular structures. Animals burn the energy stored in their cells to power their brains and muscles, exerting effort on the environment around them to craft tools, build buildings, create art, etc. Each one of these acts is a small fight against entropy. But on the loooong scale of things, the sun will eventually run out of fusion-able matter and burn out, Earth will stop absorbing this energy, whatever is left on the planet will decay and disperse into the universe, and the entropy of everything will increase. The end state of this is a theoretical state known as “the heat death” of the universe, wherein there are no systems that have localized decreases in entropy.

Anonymous 0 Comments

I think Brian Cox has a good explanation in one of his shows.

If you build a sandcastle in a desert, it is a highly ordered structure (low entropy), but over time it will revert back to just grains of sand (high entropy).

However, in all the time the desert has existed, the sand has never randomly arranged itself into a sandcastle. It is not impossible, just very unlikely, because things naturally tend to go from low to high entropy.

Anonymous 0 Comments

Others have pointed out that it is a measure of the order/disorder in a system. But I think that is not quite getting to the point.

The point is that it is a way to know how much work can be done in a closed system.

First, it is easy to misunderstand “closed system”. This is a concept of an isolated hunk of stuff where nothing comes in, and nothing goes out. Well, there is no such thing other than (arguably) the entire universe.

For example, the Earth is sitting in space, not touching anything, but is not closed because energy comes from the sun as radiation particles. And the Earth radiates lower heat energy, and has particles stripped away.

So this is more of a thought experiment, and in practice, a way to find inputs to a system when those inputs may not be obvious.

So… As a thought experiment: if you make a closed system out of the earth, then the plant life will all die from lack of light, then everything else will die in a chain reaction. That is not the interesting bit. Over time, the crust of the Earth would heat up and melt. Why? In Earth’s current state, the core is the hottest due to several processes, and the crust is cool because it is able to radiate heat to space. In a “closed system Earth”, Earth can no longer radiate out heat, so the entire Earth becomes the same temperature over time. And that temperature is high enough to melt the crust.

This is more interesting than it might seem at first. Most power generation has to do with temperature differences. The simple example is a steam engine: it is the concentrated heat on one side that provides the energy. In our “closed system Earth”, once the heat has equalized, there is much less opportunity to create such engines. The Earth would be in a higher entropy state. Less work can be accomplished.

In the real world, we can literally dump water into a deep hole, have the Earth heat the water to steam, and use the steam to run a generator. This is a lower entropy state. More work can be accomplished.
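The link between temperature differences and extractable work can be sketched with the Carnot limit: no heat engine can beat an efficiency of 1 − T_cold/T_hot (temperatures in kelvin). The numbers below are illustrative, not real geothermal figures:

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum fraction of heat that any engine can convert to work
    when running between a hot and a cold reservoir (Carnot limit)."""
    return 1 - t_cold_k / t_hot_k

# Geothermal-style gradient: hot rock at 500 K, surface water at 300 K
print(carnot_efficiency(500, 300))  # 0.4 — up to 40% of the heat can become work
# "Closed-system Earth" after everything has equalized at 500 K
print(carnot_efficiency(500, 500))  # 0.0 — no gradient, no work at all
```

This is the quantitative version of the point above: the equalized, high-entropy Earth still contains plenty of heat, but with no temperature difference left, none of it can be turned into work.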