
I have read COUNTLESS explanations, countless examples, countless ELI5s. But I still have no fucking idea what entropy is. I’m **not** leaving this earth before I understand what it is. Thank you.

In: Physics

I hope you get a decent ELI5 answer so I can hopefully understand what it is. My loose grasp of entropy is the transition from order to chaos.

Over time, shit changes to the temperature of the things next to it. This is true for literally everything, so the temperature of the Earth is slowly equalizing with the temperature of the space next to it (which is technically not quite a vacuum, since there are tiny particles in it).

“Entropy” is that idea. Energy slowly spreads out over time until the system has equal energy everywhere, and then nothing can change, since it’s perfectly even.

“Heat Death”, often referred to along with entropy (so I’m guessing you want an explanation for that too), is when, over trillions of years, literally everything perfectly equalizes its heat, and then equalizes all of its other forms of energy into heat, because heat is bled off every time there’s an action (usually in the form of friction). Eventually, literally everything will be an evenly distributed, evenly “room temperature” blob, and because there’s literally no potential energy left, literally nothing more can happen.

*edit: entropy is also a word which can be used when not referring to the physics concept, and in that case it simply means the general concept of stuff eventually falling apart and becoming useless.

It’s a difficult concept, and it really comes down to energy dispersal and statistical likelihood.

This is one of the easiest to understand videos on the topic I have found.

https://youtu.be/YM-uykVfq_E

It won’t make you an expert, but it’ll give a better idea of what entropy actually is.

Basically: Entropy is the number of possible permutations a system can exist in.

It’s not really as simple as that; there’s a lot of thermodynamics and quantum theory involved, and there’s actually a theory that, rather than average entropy increasing over time, time increases with the average entropy of the universe. But at the end of the day, the simplest way to put it is that things want to have as much freedom as possible.

Everything from position, orientation, kinetic energy of the individual molecules, and distance between the molecules contributes to the entropy of the system.

Consider a system that contains four particles. They each have two states; let’s call them high energy and low energy. We’ll represent this with 0 for low and 1 for high. What are the possible states?

* 0000
* 0001
* 0010
* 0011
* 0100
* 0101
* 0110
* 0111
* 1000
* 1001
* 1010
* 1011
* 1100
* 1101
* 1110
* 1111

So with four particles, and two options each, we have 16 possible states. One thing you can immediately notice: the ordered states are rare. There’s just 0000 and 1111. 2 of 16.

So what is entropy? Literally that. It’s how many possible states there are, and how rare a given one is among them. The 0000 state is low entropy. One like 1010 is just as rare on its own, but it has many similar-looking ones that together are high entropy. Just replace the digits with atoms, and the two possible states with many more degrees of freedom.

If you have a few billion, billion, billion atoms, there are a lot of ways you could arrange them. A perfect crystal has a few options, but not many. An imperfect solid has a lot more options. A hot gas has way, way, way more options. So there are simply more ways to arrange a bunch of atoms so that they macroscopically look like a hot gas than there are ways that macroscopically look like a perfect crystal. Just as an ordered number like 11111111 is rarer than messy ones like 10111010 if you were to draw the digits randomly from a hat. And this is why entropy increases: if something changes, it’s way, way more likely to go to a messy state than an ordered one. It’s not impossible, but highly, highly unlikely.
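If you want to poke at the four-particle example yourself, here’s a quick sketch (plain Python; the 0/1 states are exactly the ones listed above) that enumerates the 16 states and groups them by how many particles are high-energy:

```python
from itertools import product

# Enumerate all states of 4 particles, each 0 (low energy) or 1 (high energy).
states = ["".join(bits) for bits in product("01", repeat=4)]
print(len(states))  # 16 states in total

# Group states by how many particles are in the high-energy state.
by_count = {}
for s in states:
    by_count.setdefault(s.count("1"), []).append(s)

for k in sorted(by_count):
    print(k, len(by_count[k]), by_count[k])
# The "ordered" extremes (0000 and 1111) each have exactly 1 arrangement,
# while the half-and-half macrostate (two 1s) has 6 -- the most common.
```

The counts per group (1, 4, 6, 4, 1) are just the binomial coefficients, which is why mixed-looking states dominate as the particle count grows.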

Now you may be saying: my example looks an awful lot like binary numbers. And it is. Entropy and information are really the same thing: orderings of states. You can measure entropy in bits. But the information we use on a computer is more abstract and on a much, much, much smaller scale than the thermodynamic entropy of matter. Better to use J/K, the thermodynamic unit. Now, you may think that looks like some sort of relationship between energy (joules) and temperature (kelvin). But let’s roll this back for a second, and consider that temperature is actually the relationship, not entropy. Rather than entropy = energy/temperature, consider that temperature = energy/entropy. Temperature is a relationship between changes in entropy and changes in energy. How does adding more energy change the possible states? That is temperature.
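To put a number on the “you can measure entropy in bits” remark: one bit corresponds to k_B·ln 2 in thermodynamic units (this is the standard conversion, with k_B the Boltzmann constant), which shows just how much smaller computer-scale entropy is. A quick sketch:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

# One bit of entropy expressed in thermodynamic units:
one_bit = k_B * math.log(2)
print(f"1 bit = {one_bit:.3e} J/K")  # ~9.57e-24 J/K

# Even a terabyte of data is a minuscule amount of thermodynamic entropy:
terabyte_bits = 8e12
print(f"1 TB  = {terabyte_bits * one_bit:.3e} J/K")
```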

Entropy is such a complex topic and its uses range from classic thermodynamics to statistical mechanics… but let me try to paint you a picture

A car engine works by converting thermal energy into mechanical energy. The process works with some kind of combustion that violently pushes the pistons and hence makes them move. Thermal energy -> mechanical energy.

The first law says you cannot get more energy out than you put in. That means the maximum mechanical energy you can get is lower than or equal to the energy put into combustion.

Furthermore, the second law (which is about entropy) states that you cannot even break even. That means that in reality you can NEVER get as much mechanical energy out as the energy you put in. Crazy, right? You’d expect the maximum efficiency of an engine to be 100%, but in reality you can’t even reach that number, because gasoline engines, for instance, are theoretically only capable of being about 50% efficient. (You can mathematically prove this.) And that’s under the best conditions. In the real world we still only achieve efficiencies of around 30%. Pretty crap, right?
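That theoretical cap comes from the Carnot limit, efficiency = 1 − T_cold/T_hot (temperatures in kelvin). Here’s a tiny sketch; the specific reservoir temperatures are made-up illustrative numbers, not figures from the comment above:

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum possible efficiency of a heat engine between two reservoirs."""
    return 1.0 - t_cold_k / t_hot_k

# Illustrative numbers: a hot reservoir at 600 K exhausting to 300 K
# caps efficiency at 50%, matching the ballpark figure above.
print(carnot_efficiency(600.0, 300.0))  # 0.5
```

Note that no amount of engineering cleverness gets past this number; it follows from the second law alone.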

Real-life engines are irreversible processes. That means you cannot reverse the process (duh). In an irreversible process, total entropy always increases; it only stays the same in an idealized reversible one. Thus for a car engine, realistically, the entropy will always increase between combustion and mechanical work. In a way, entropy helps us understand whether systems are playing in “forward time” or “reverse time”.

And that’s just in thermodynamics. Statistical mechanics explains entropy with probabilistic distributions and stuff. Anyway entropy is really fucking complicated, so you shouldn’t be obsessed with learning everything about it unless you want to become a professor.

I’ll give you the answer that helped me as a physics undergraduate.

Entropy is a measure of the number of microstates corresponding to a particular macrostate. If you’ve not studied physics, that probably means nothing to you, so here’s an example.

Imagine a cup of coffee. It’s filled with coffee and milk mixed together. Let’s say that 80% of the particles in the cup are coffee particles and 20% are milk.

There are a certain number of ways you can arrange those particles of milk and coffee in the cup. Each particular arrangement is a microstate. If you could somehow break the mixture down and see exactly where each particle is, you’d know its microstate.

What do you actually see? You see a cup where the coffee and milk are well mixed together. Even if you don’t stir it, they’ll eventually look like that. That’s the macrostate – the overall state of the system.

An alternative macrostate would be to have the milk all in one layer on top of the coffee, not mixed together at all. There is nothing physically preventing this from happening. But there are only a few ways to arrange the coffee and milk particles that would produce this result. If you could take that cup full of milk and coffee particles and randomise their positions, on almost every randomisation you’d end up with a cup where the particles are evenly mixed. Because there are far more possibilities that produce that result.

That’s what people mean when talking about entropy as disorder. The more ‘disordered’ state has all the particles mixed up, while having them cleanly separated is ‘ordered’. The disordered state has a higher entropy, because there are more ways you can arrange the particles to get that result.

That’s why, if you pour some milk in a cup and leave it, it’ll eventually mix together on its own. Stirring just speeds it up.
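To make “far more possibilities that produce that result” concrete, here’s a toy count (the 100-particle, 20%-milk numbers are just for illustration): compare the number of ways the milk particles can sit anywhere in the cup with the single arrangement where they all sit in a separate layer on top:

```python
from math import comb

# Toy cup: 100 slots, 20 of which hold milk particles (the rest coffee).
total_arrangements = comb(100, 20)   # choose which slots hold milk: anywhere
separated = 1                        # milk occupying exactly the top 20 slots

print(f"{float(total_arrangements):.3e} arrangements in total, "
      f"only {separated} fully separated")
# Randomise the positions and you essentially never see the separated layout.
```

With realistic particle counts (~10^23) the ratio becomes so lopsided that “never unmixes on its own” is a safe bet, which is the whole content of the second law here.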

(For the record: I don’t understand entropy, and I’ve kind of made my peace with it. I 100% don’t get how information has entropy.)

Part of the reason entropy is hard to teach is that it started as people tried to figure out steam engines (hence a bunch of heat-related explanations), tied to a bunch of woo-woo but also grounded in massive, giant entropy tables for steam. Add in the ability of computers to do massive statistics, and the love of physicists for tying seemingly unrelated branches of knowledge together, and you get kind of a mess.

Not only is it hard, but it’s hard with a long history.

Did you ever get an answer that helped you understand it? I am not a physicist, but one of my close friends is and they explained very clearly what entropy ‘is’. If you are still confused, let me know and I’ll type out my understanding.

Entropy refers to disorder/energy: the higher the entropy, the greater the disorder. A big thing to remember is that our definition of chaos and order is different from the thermodynamic definition. To us, order is everything being even, but to thermodynamics it’s everything being concentrated at one point. Everything moves towards disorder, but it all wants to be at the same level of disorder.

Think about a glass of water that’s all the same temperature, at room temp. It has a state of high entropy; it is disordered. But if you drop an ice cube in the cup, all the difference in temperature is organized and consolidated in one place, so it has low entropy and is orderly. Since all things move toward disorder, eventually the ice will melt, and given enough time all the water will be the same temperature. Eventually that temperature would seep into the air, making the water room temperature again: high entropy.

Basically the universe likes stuff to be the same

You can see the entropy of a system as the amount of information you gain by looking at the system (measured in bits).

Let’s examine the entropy of a single switch. If you know that this switch is stuck in the ON position, you will gain zero information from looking at it. So the entropy is 0.

If now the switch can be both ON and OFF, its entropy is not 0, because you cannot know its position with certainty before looking at it. So what is its value? Well, it depends. Imagine you know that 2/3 of the time it’s in the ON position. Then you have a better chance at guessing its position than if it were 50-50. Thus, you gain less information by looking at the switch if the chances are 2/3-1/3 or 1/3-2/3 than if they are 50-50 (1/2-1/2). So the entropy of the switch is maximal when the odds are even, i.e. when it can be in each state with equal probability.

Now consider a system of 100 switches. It can be in 2×2×2×… (100 times) states, or 2^100. The number of states is immensely larger, but the same rule holds: entropy is maximal when the odds are equal between all states. Of course this maximum is a lot higher than for a single switch, because you need to take in information on 100 switches (actually it’s simply 100× as much). The universe is the same, just with a lot more degrees of freedom.
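The switch entropies above can be computed directly with Shannon’s formula (entropy in bits = −Σ p·log2 p); this little sketch reproduces the stuck switch, the 2/3-biased switch, and the 100-switch scaling:

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy in bits: average information gained per look."""
    return 0.0 - sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy_bits([1.0]))         # stuck switch: 0 bits
print(entropy_bits([2/3, 1/3]))    # biased switch: ~0.918 bits, less than 1
print(entropy_bits([0.5, 0.5]))    # fair switch: 1 bit, the maximum
# 100 independent fair switches carry 100x as much: 100 bits.
print(100 * entropy_bits([0.5, 0.5]))
```

Notice the biased switch sits below 1 bit, exactly the “you gain less information if the odds aren’t even” point from above.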

You perhaps know the second law of thermodynamics: entropy always increases. This means that any physical system tends towards equalizing the odds. Physically, equal odds correspond to homogeneous systems. That’s why ink irreversibly diffuses in water: it’s pushed by entropy.

About heat: heat makes particles more agitated. They move around more. Let’s say particle A can be in position 1, 2 and 3. When it’s cold it’s gonna be in position 2 90% of the time and in position 1 and 3 5% of the time each. When it’s hot it’s gonna be 40% of the time in 2 and 30% of the time each in 1 and 3. So heat pushes towards equal odds. A consequence is that the second law makes heat easy to create but hard to destroy. You can convert electrical or mechanical energy to heat with almost perfect efficiency, but the opposite direction is much less efficient (typically 30-50%).
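Sticking with the bits idea, you can check that the hot distribution (closer to equal odds) really does have more entropy than the cold one; the probabilities below are the ones from the paragraph above:

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy in bits of a discrete probability distribution."""
    return 0.0 - sum(p * math.log2(p) for p in probabilities if p > 0)

cold = [0.05, 0.90, 0.05]  # particle strongly prefers position 2
hot  = [0.30, 0.40, 0.30]  # odds much closer to equal

print(f"cold: {entropy_bits(cold):.3f} bits")
print(f"hot:  {entropy_bits(hot):.3f} bits")
# The hot distribution is closer to equal odds, so its entropy is higher.
```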

Final point: making a cold object 10° warmer increases entropy more than making a hot object (of the same mass) 10° warmer. In terms of entropy creation, there are diminishing returns with temperature. That’s why if you put a hot and a cold object in contact, physics will dictate that the heat flows from the hot object to the cold one until they reach equilibrium. Cooling down the hot object destroys less entropy than is created by warming up the cold one.
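That final point is just ΔS = Q/T in disguise: the same heat Q counts for more entropy at a lower temperature. A sketch with made-up numbers:

```python
# Transfer the same amount of heat between two objects at fixed temperatures.
Q = 1000.0                     # joules of heat (illustrative number)
T_hot, T_cold = 400.0, 300.0   # kelvin (illustrative)

dS_lost   = Q / T_hot    # entropy removed from the hot object
dS_gained = Q / T_cold   # entropy added to the cold object
print(dS_lost, dS_gained)

# Net entropy change is positive, so heat flowing hot -> cold is allowed;
# the reverse direction would destroy entropy on net, so it never happens
# spontaneously.
print(dS_gained - dS_lost)
```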

I’ve been reading responses for 10 minutes and I understand entropy less now than when I got here. 😅

Entropy basically means “time goes forward” or “what is done cannot be undone”

Maybe that’s too simplistic, let me try again.

Ok, you know your big brother who’s a teenager? Yes, I love him too. Do you remember how on Saturday his room was all tidy? But as time goes on it slowly gets messier? If we left it alone long enough it would just be chaos in there, a meaningless arrangement of all his stuff.

That’s sorta like entropy, where over time any closed system tends towards chaos.

What’s that you say? His room doesn’t just get more messy? That’s right, because your mom or I make him clean up once in a while. You could think of our intervention as external energy being applied to the system.

Everything turns to dust and dies, given enough time.

Eventually everything will just be one big blob of identical dust, and nothing will change ever again.

While there are some great answers here, as you’ve probably figured out, entropy is one of those concepts that just doesn’t really lend itself to a simple explanation. In my experience, Brian Greene’s book *Until the End of Time* has one of the better explanations of the concept I’ve read so far, as well as some interesting theories on what the entropic state of the universe might look like billions and trillions of years down the road.

First, entropy is an invented concept. It isn’t something physical. It is an idea that happens to be very useful in describing our world/universe. This idea is very useful when applied to how energy works in the physical world, and how information works in the abstract.


Now think about something happening – anything: a chemical reaction, something moving from point A to point B, an egg cracking as it falls on the floor…anything you want. How do you determine if this ‘action’ or ‘event’ is *even possible*?

That depends on whether the energy put into the system is more than the energy required to make the thing happen. Makes sense, right? And of course *how fast* this thing happens depends on how big that difference in energy is.

When you drop an egg on the floor and it hits… the kinetic energy of the egg is MORE than the bonding energy between the molecules of the eggshell keeping it together. Therefore the egg cracks. The greater the kinetic energy, the faster the egg cracks.

Well, not quite. It turns out that this is not enough. Apparently, there is an additional contribution. The eggshell contains molecules that are bonded together in a particular place. And really…why should they remain in that place? They just want to jiggle…because they have an internal energy. They have a certain temperature. So their bonding energy has to counter their tendency to simply jiggle away. They don’t want to be structured…they want to randomly float away.

When the egg drops…the bonding energy is countered by two different energies: the kinetic energy of the egg PLUS the energy due to the egg molecules’ tendency to jiggle away at that temperature. This tendency to jiggle away into randomness is **entropy**. You can think of it as follows: equilibrium means everything and everywhere is equal. Every point in space should be the same. The universe tends to equilibrium. But ANY kind of structure in anything creates a difference between two points of space. The amount of egg stuff within the egg is different from the amount of egg stuff outside the egg. Maintaining this structure requires energy. The hotter (jigglier) things are, the more energy is required to maintain the structure. This additional ***energy that the intact structure has***, per degree of temperature, is compensating an energy debt. That compensating energy credit is **entropy**.


If you go really, really abstract…what it means is the following: it requires energy to be only one certain specific way when there are many, many millions of possible ways to be. This applies to everything: there are infinitely many ways an egg can exist as cracked, but only one way it can exist intact. Remaining intact is an **energy credit**.

However many ways there are of being alive, there are vastly more ways of being dead. Life is forever in energy credit…which it must compensate by generating energy (from food, for example). When you eat…you exist because your energy credit came at the expense of the destruction of the energy credit of your food. You exist because your food no longer does: the arrangement of a plant or an animal was destroyed, its entropy increased, and the resulting energy was used to fund the energy credit of your structure.

However many ways you can arrange letters, there are only a few arrangements that make sense. That sensible information is an energy credit to counter *information entropy*. This is what Shannon worked out.

When we talk of physical things, we are talking of ‘thermodynamic entropy’. When we talk of patterns in the abstract, like letters or the series of 1s and 0s in a computer, we are talking of information entropy. It turns out that whenever we encode information in a physical medium (the internet is, in the end, electrons moving around in silicon), the “information entropy” manifests as “thermodynamic entropy”.

And finally, because the universe has a temperature, a basic jiggliness, everything always wants to jiggle away into randomness. Randomness is the only true equilibrium…all patterns are an energy credit, and the moment there is a chance, the universe wants to take that credit back…entropy always increases on the whole. That is the second law of thermodynamics.


TL;DR: At any temperature above absolute zero, physical objects have a tendency to jiggle away. The energy required, per degree of temperature, to counter this specific tendency to jiggle away into randomness is the entropy.

For abstract patterns, an analogy: smashing away on the keyboard randomly creates a string of letters – say 7 letters, ‘ptryneo’. Another random set of the same 7 letters, ‘reponty’, requires the same amount of energy, i.e. 7 keystrokes’ worth.

But to make them spell ‘entropy’, you need 7 keystrokes’ worth of energy + the additional energy equivalent of arranging them in a specific order so that they make sense and contain information. This *additional* energy requirement is **entropy**.

Rather than saying entropy is the amount of ‘disorder’, it is more accurate to say “entropy” is the energetic cost of creating order from randomness.