Entropy, please oh god please.


I have read COUNTLESS explanations, countless examples, countless ELI5s. But I still have no fucking idea what entropy is. I’m **not** leaving this earth before I understand what it is. Thank you.

In: Physics

16 Answers

Anonymous 0 Comments

So I am going to try to explain this in a slightly different way. Forget everything you’ve read about this.

First, entropy is an invented concept. It isn’t something physical. It is an idea that happens to be very useful in describing our world/universe. It applies to how energy works in the physical world, and to how information works in the abstract.

Now think about something happening – anything…a chemical reaction, something moving from point A to point B…an egg cracking as it falls on the floor…anything you want. How do you determine if this ‘action’ or ‘event’ is *even possible*?

That depends on whether the energy put into the system is more than the energy required to make the thing happen. Makes sense, right? And of course *how fast* this thing happens depends on how big that difference in energy is.

When you drop an egg on the floor and it collides… the kinetic energy of the egg is MORE than the bonding energy between the molecules of the egg shell keeping it together. Therefore the egg cracks. The greater the kinetic energy, the faster the egg cracks.

Well, not quite. It turns out that this is not enough. Apparently, there is an additional contribution. The egg shell contains molecules that are bonded together in a particular arrangement in space. And really…why should they remain in that place? They just want to jiggle…because they have an internal energy. They have a certain temperature. So their bonding energy has to counter their tendency to simply jiggle away. They don’t want to be structured…they want to randomly float away.

When the egg drops…the bonding energy is countered by two different energies – the kinetic energy of the egg PLUS the energy due to the egg molecules’ tendency to jiggle away at that temperature. This tendency to jiggle away into randomness is **entropy**. You can think of it as follows: equilibrium means everything, everywhere, is equal. Every point in space should be the same. The universe tends to equilibrium. But ANY kind of structure creates a difference between two points of space. The amount of egg stuff within the egg is different from the amount of egg stuff outside the egg. Maintaining this structure requires energy. The hotter (jigglier) things are, the more energy is required to maintain the structure. This additional ***energy that the intact structure has*** per degree of temperature is compensating an energy debt. That compensating energy credit is **entropy**.
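
(A hedged aside, in textbook notation this answer otherwise avoids: this bookkeeping of bonding energy versus temperature-weighted jiggliness is what the free energy tracks,

ΔG = ΔH − T·ΔS

where a change happens on its own only when ΔG < 0, i.e. when the energy released, ΔH, outweighs the temperature-weighted entropy term T·ΔS. The “energy per degree of temperature” above is the S.)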

If you go really, really abstract…what it means is the following: it requires energy to be only one specific way when there are many, many millions of ways possible to be. This applies to everything – there are countless ways an egg can exist as cracked, but only one way it can exist intact. Remaining intact is an **energy credit**.
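
(Same aside: this counting of “ways to be” is Boltzmann’s entropy formula,

S = k · ln(W)

where W is the number of microscopic ways a state can be realized. One way to be intact means W ≈ 1 and S ≈ 0; astronomically many ways to be cracked means a huge W and a large S.)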

However many ways there are of being alive, there are vastly more ways of being dead. Life is forever in energy credit…which it must fund by generating energy (from food, for example). When you eat…you exist because your energy credit came at the expense of destroying the energy credit of your food. You exist because your food no longer does – the arrangement of a plant or an animal was destroyed, its entropy increased, and the resulting energy was used to fund the energy credit of your structure.

However many ways you can arrange letters, there are only a few ways to arrange them such that they make sense. That sensible information is an energy credit countering *information entropy*. This is what Shannon figured out.

When we talk of physical things, we are talking of ‘thermodynamic entropy’. When we talk of patterns in the abstract…like letters, or the series of 1s and 0s in a computer, we are talking of ‘information entropy’. It turns out that whenever we encode information in a physical medium – the internet is, in the end, electrons moving around in silicon – the “information entropy” manifests as “thermodynamic entropy”.
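
(The answer doesn’t name it, but the standard bridge here is Landauer’s principle: erasing one bit of information in any physical medium must dissipate at least

E = k · T · ln(2)

of energy as heat, where T is the temperature of the medium. That is the precise sense in which information entropy shows up as thermodynamic entropy.)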

And finally, because the universe has a temperature, a basic jiggliness, everything always wants to jiggle away into randomness. Randomness is the only true equilibrium…all patterns are an energy credit, and the moment there is a chance, the universe wants to take that credit back…entropy always increases on the whole. That is the second law of thermodynamics.

TL;DR: At any temperature above absolute zero, physical objects have a tendency to jiggle away. The energy required, per degree of temperature, to counter this tendency to jiggle away into randomness is the entropy.

For abstract patterns, an analogy: smashing away on the keyboard randomly creates a string of letters – say 7 letters, ‘ptryneo’. Another random arrangement of the same 7 letters, ‘reponty’, requires the same amount of energy, i.e. 7 keystrokes’ worth.

But to make them spell ‘entropy’, you need 7 keystrokes’ worth of energy PLUS the additional energy equivalent of arranging them in a specific order so that they make sense…and contain information. This *additional* energy requirement is **entropy**.
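
(To put a rough number on that, a minimal Python sketch – the 7-letter word is just the example above, and the “bits” are Shannon’s measure, which this answer hasn’t formally introduced. The letters of ‘entropy’ are all distinct, so there are 7! = 5040 orderings and only one spells the word:)

```python
from math import factorial, log2

letters = "entropy"                      # 7 distinct letters
arrangements = factorial(len(letters))   # 7! = 5040 possible orderings

# Bits needed to single out the one meaningful ordering -- the
# "additional energy equivalent" the analogy is gesturing at.
extra_bits = log2(arrangements)

print(arrangements)            # 5040
print(round(extra_bits, 1))    # 12.3
```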

Rather than saying entropy is the amount of ‘disorder’, it is more accurate to say entropy is the energetic cost of creating order from randomness.

Anonymous 0 Comments

While there are some great answers here, as you’ve probably figured out, entropy is one of those concepts that just doesn’t really lend itself to a simple explanation. In my experience, Brian Greene’s book *Until the End of Time* has one of the better explanations of the concept I’ve read so far, as well as some interesting theories on what the entropic state of the universe might look like billions and trillions of years down the road.

Anonymous 0 Comments

Everything turns to dust and dies, given enough time.

Eventually everything will just be one big blob of identical dust, and nothing will change ever again.

Anonymous 0 Comments

Entropy basically means “time goes forward” or “what is done cannot be undone”

Maybe that’s too simplistic, let me try again.

Ok, you know your big brother who’s a teenager? Yes, I love him too. Do you remember how on Saturday his room was all tidy? But as time goes on it slowly gets messier? If we left it alone long enough it would just be chaos in there, a meaningless arrangement of all his stuff.

That’s sorta like entropy, where over time any closed system tends towards chaos.

What’s that you say? His room doesn’t just get more messy? That’s right, because your mom or I make him clean up once in a while. You could think of our intervention as external energy being applied to the system.

Anonymous 0 Comments

I’ve been reading responses for 10 minutes and I understand entropy less now than when I got here. 😅

Anonymous 0 Comments

You can see the entropy of a system as the amount of information you gain by looking at the system (measured in bits).

Let’s examine the entropy of a single switch. If you know that this switch is stuck in the ON position, you will gain zero information from looking at it. So the entropy is 0.

If instead the switch can be both ON and OFF, its entropy is not 0, because you cannot know its position with certainty before looking at it. So what is its value? Well, it depends. Imagine you know that 2/3 of the time it’s in the ON position. Then you have a better chance at guessing its position than if it were 50-50. Thus, you gain less information by looking at the switch if the chances are 2/3-1/3 or 1/3-2/3 than if they are 50-50 (1/2-1/2). So the entropy of the switch is maximal when the odds are even, i.e. when it can be in each state with equal probability.

Now consider a system of 100 switches. It can be in 2×2×2×… (100 times) states, or 2^100. The number of states is immensely larger, but the same rule holds: entropy is maximal when the odds are equal between all states. Of course this maximum is a lot higher than for a single switch, because you need information on 100 switches (in fact it’s simply 100× as much). The universe is the same, just with a lot more degrees of freedom.
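
(A quick sketch of those numbers in Python – Shannon’s formula H = Σ −p·log2(p) isn’t written out above, but it is the standard one behind this answer:)

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = sum(-p * log2(p))."""
    return sum(-p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))             # 1.0 bit: a fair switch
print(round(entropy([2/3, 1/3]), 3))   # 0.918 bits: a biased switch tells you less
print(entropy([1.0]))                  # 0.0 bits: a stuck switch tells you nothing

# 100 independent fair switches: entropies add up to 100 bits,
# i.e. log2 of the 2**100 equally likely states.
print(100 * entropy([0.5, 0.5]))       # 100.0
```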

You perhaps know the second law of thermodynamics: entropy always increases. This means that any physical system tends towards equalizing the odds. Physically, equal odds correspond to homogeneous systems. That’s why ink irreversibly diffuses in water: it’s pushed by entropy.

About heat: heat makes particles more agitated. They move around more. Let’s say particle A can be in positions 1, 2 and 3. When it’s cold, it’s gonna be in position 2 90% of the time and in positions 1 and 3 5% of the time each. When it’s hot, it’s gonna be in position 2 40% of the time and in positions 1 and 3 30% of the time each. So heat pushes towards equal odds. A consequence is that the second law makes heat easy to create but hard to destroy. You can convert electrical or mechanical energy to heat with almost perfect efficiency, but the opposite direction is much less efficient (typically 30-50%).
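
(Plugging the example’s made-up 90/5/5 and 40/30/30 splits into the same formula:)

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = sum(-p * log2(p))."""
    return sum(-p * log2(p) for p in probs if p > 0)

cold = [0.90, 0.05, 0.05]    # mostly stuck in position 2
hot = [0.40, 0.30, 0.30]     # odds much closer to equal

print(round(entropy(cold), 3))   # 0.569 bits
print(round(entropy(hot), 3))    # 1.571 bits: heat pushed the odds
                                 # towards equal and raised the entropy
```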

Final point: making a cold object 10° warmer increases entropy more than making a hot object (of the same mass) 10° warmer. In terms of entropy creation, there are diminishing returns with temperature. That’s why if you put a hot and a cold object in contact, physics will dictate that the heat flows from the hot object to the cold one until they reach equilibrium. Cooling down the hot object destroys less entropy than is created by warming up the cold one.
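
(A hedged worked example with numbers of my own choosing: move a joule of heat Q from a hot object at 400 K to a cold one at 300 K. Each object’s entropy change is the standard ΔS = Q/T:)

```python
Q = 1.0                           # joules of heat moved (illustrative)
T_hot, T_cold = 400.0, 300.0      # kelvin (illustrative)

dS_hot = -Q / T_hot     # hot object loses entropy:  -0.00250 J/K
dS_cold = Q / T_cold    # cold object gains more:    +0.00333 J/K

print(dS_hot + dS_cold)  # ~ +0.00083 J/K > 0: total entropy rises,
                         # which is why heat flows hot -> cold
```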

Anonymous 0 Comments

Entropy refers to disorder/energy: the higher the entropy, the greater the disorder. A big thing to remember is that our everyday definition of chaos and order is different from the thermodynamic one. To us, order is everything being even; to thermodynamics, it’s everything being concentrated at one point. Everything moves towards disorder, but it all wants to be at the same level of disorder.

Think about a glass of water that’s all at room temperature. It has a state of high entropy; it is disordered. But if you drop an ice cube in the cup, all the difference in temperature is organized and consolidated into one point, so the system has low entropy and is orderly. Since all things move towards disorder, eventually the ice will melt, and given enough time all the water will be the same temperature. Eventually that temperature difference will seep into the air, making the water room temperature again – high entropy.

Basically the universe likes stuff to be the same

Anonymous 0 Comments

Did you ever get an answer that helped you understand it? I am not a physicist, but one of my close friends is and they explained very clearly what entropy ‘is’. If you are still confused, let me know and I’ll type out my understanding.

Anonymous 0 Comments

(For the record: I don’t understand entropy, and I’ve kind of made my peace with it. I 100% don’t get how information has entropy.)

Part of the reason entropy is hard to teach is that it started as people tried to figure out steam engines (hence a bunch of heat-related explanations), tied to a bunch of woo-woo but also grounded in massive entropy tables for steam. Add in the ability of computers to do massive statistics, and the love physicists have for tying seemingly-unrelated branches of knowledge together, and you get kind of a mess.

Not only is it hard, but it’s hard with a long history.

Anonymous 0 Comments

I’ll give you the answer that helped me as a physics undergraduate.

Entropy is a measure of the number of microstates corresponding to a particular macrostate. If you’ve not studied physics, that probably means nothing to you, so here’s an example.

Imagine a cup of coffee. It’s filled with coffee and milk mixed together. Let’s say that 80% of the particles in the cup are coffee particles and 20% are milk.

There are a certain number of ways you can arrange those particles of milk and coffee in the cup. Each particular arrangement is a microstate. If you could somehow break the mixture down and see exactly where each particle is, you’d know its microstate.

What do you actually see? You see a cup where the coffee and milk are well mixed together. Even if you don’t stir it, they’ll eventually look like that. That’s the macrostate–the overall state of the system.

An alternative macrostate would be to have the milk all in one layer on top of the coffee, not mixed together at all. There is nothing physically preventing this from happening. But there are only a few ways to arrange the coffee and milk particles that would produce this result. If you could take that cup full of milk and coffee particles and randomise their positions, on almost every randomisation you’d end up with a cup where the particles are evenly mixed, because there are far more arrangements that produce that result.
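
(A toy version of that counting in Python, with numbers small enough to compute – a real cup has more like 10^24 particles. Take 20 slots with 4 milk particles, the 80/20 mix from above, and call ‘layered’ any arrangement where all the milk sits in the top 4 slots:)

```python
from math import comb

slots, milk = 20, 4        # toy cup: 20 positions, 4 milk particles (80/20 mix)

total = comb(slots, milk)  # ways to place the milk anywhere: 4845
layered = comb(4, milk)    # ways with all milk in the top 4 slots: 1

print(total)               # 4845 microstates in total
print(layered / total)     # ~0.0002: the layered macrostate is already rare here,
                           # and with real particle counts it's effectively zero
```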

That’s what people mean when talking about entropy as disorder. The more ‘disordered’ state has all the particles mixed up, while having them cleanly separated is ‘ordered’. The disordered state has a higher entropy, because there are more ways you can arrange the particles to get that result.

That’s why, if you pour some milk in a cup and leave it, it’ll eventually mix together on its own. Stirring just speeds it up.