(For the record: I don’t understand entropy, and I’ve kind of made my peace with it. I 100% don’t get how information has entropy.)
Part of the reason entropy is hard to teach is that it started as people tried to figure out steam engines (hence a bunch of heat-related explanations), tied to a bunch of woo-woo but also grounded in massive, giant entropy tables for steam. Add in the ability of computers to do massive statistics, and physicists' love of tying seemingly-unrelated branches of knowledge together, and you get kind of a mess.
Not only is it hard, but it’s hard with a long history.
Entropy basically means "time goes forward" or "what is done cannot be undone."
Maybe that’s too simplistic, let me try again.
Ok, you know your big brother who’s a teenager? Yes, I love him too. Do you remember how on Saturday his room was all tidy? But as time goes on it slowly gets messier? If we left it alone long enough it would just be chaos in there, a meaningless arrangement of all his stuff.
That’s sorta like entropy, where over time any closed system tends towards chaos.
What’s that you say? His room doesn’t just get more messy? That’s right, because your mom or I make him clean up once in a while. You could think of our intervention as external energy being applied to the system.
While there are some great answers here, as you've probably figured out, entropy is one of those concepts that just doesn't really lend itself to a simple explanation. In my experience, Brian Greene's book *Until the End of Time* has one of the better explanations of the concept I've read so far, as well as some interesting theories on what the entropic state of the universe might look like billions and trillions of years down the road.
So I am going to try to explain this in a slightly different way. Forget everything you’ve read about this.
First, entropy is an invented concept. It isn't something physical. It is an idea that happens to be very useful in describing our world/universe: how energy works in the physical world, and how information works in the abstract.
Now think about something happening – anything…a chemical reaction, something moving from point A to point B…an egg cracking as it falls on the floor…anything you want. How do you determine if this 'action' or 'event' is ***even possible***?
That depends on whether the energy put into the system is more than the energy required to make the thing happen. Makes sense, right? And of course *how fast* this thing happens depends on how big that difference in energy is.
When you drop an egg on the floor and it collides…the kinetic energy of the egg is MORE than the bonding energy between the molecules of the egg shell keeping it together. Therefore the egg cracks. The greater the kinetic energy, the faster the egg cracks.
Well, not quite. It turns out that this is not enough. Apparently, there is an additional contribution. The egg shell contains molecules that are bonded together in a particular arrangement. And really…why should they remain in that place? They just want to jiggle…because they have an internal energy. They have a certain temperature. So their bonding energy has to counter their tendency to simply jiggle away. They don't want to be structured…they want to randomly float away.
When the egg drops…the bonding energy is countered by two different energies – the kinetic energy of the egg PLUS the energy due to the egg molecules' tendency to jiggle away at that temperature. This tendency to jiggle away into randomness is **entropy**. You can think of it as follows: equilibrium means everything, everywhere, is equal. Every point in space should be the same. The universe tends to equilibrium. But ANY kind of structure in anything creates a difference between two points of space. The amount of egg stuff within the egg is different from the amount of egg stuff outside the egg. Maintaining this structure requires energy. The hotter (jigglier) things are, the more energy is required to maintain this structure. This additional ***energy that the intact structure has***, per degree of temperature, is compensating an energy debt. That compensating energy credit is **entropy**.
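That "energy per degree of temperature" idea is essentially the classical thermodynamic definition of entropy change, ΔS = Q/T. A minimal numeric sketch of it, using standard textbook values for melting ice (my illustration, not anything from the egg example):

```python
# Classical (Clausius) entropy change: delta_S = Q / T,
# heat added divided by absolute temperature.
# Illustration: melting 1 g of ice at its melting point.
Q = 334.0    # J, latent heat of fusion for 1 g of ice (textbook value)
T = 273.15   # K, melting point of ice

delta_S = Q / T
print(round(delta_S, 2), "J/K")  # ~1.22 J/K gained as the ordered crystal becomes liquid
```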
If you go really, really abstract…what it means is the following: it requires energy to be only one certain, specific way when there are many, many millions of ways possible to be. This applies to everything – there are infinitely many ways an egg can exist as cracked, but only one way it can exist intact. Remaining intact is an **energy credit**.
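That counting idea is basically Boltzmann's formula, S = k·ln(W), where W is the number of possible arrangements. A minimal sketch, with a completely made-up W for the cracked egg just to show the direction of the effect:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(ways):
    """S = k_B * ln(W): entropy from the number of possible arrangements."""
    return k_B * math.log(ways)

ways_intact = 1        # only one way to be this particular intact egg
ways_cracked = 10**20  # hypothetical count of cracked arrangements, purely illustrative

print(boltzmann_entropy(ways_intact))   # 0.0 J/K: a single arrangement means zero entropy
print(boltzmann_entropy(ways_cracked))  # ~6.4e-22 J/K: more arrangements, more entropy
```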
However many ways there are of being alive, there are vastly more ways of being dead. Life is forever in energy credit…which it must compensate by generating energy (from food, for example). When you eat…you exist because your energy credit came at the expense of the destruction of the energy credit of food. You exist because your food no longer does – the arrangement of a plant or an animal was destroyed, its entropy increased, and the resulting energy was used to fund the energy credit of your structure.
However many ways you can arrange letters, there are only a few ways to arrange them such that they make sense. That sensible information is an energy credit to counter *information entropy*. This is what Shannon figured out.
When we talk of physical things, we are talking of 'thermodynamic entropy'. When we are talking of patterns in the abstract…like letters, or the series of 1s and 0s in a computer, we are talking of information entropy. It turns out that whenever we encode information in a physical medium – the internet is, in the end, electrons moving around in silicon – the "information entropy" manifests as "thermodynamic entropy".
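To make the Shannon side concrete, here's a small sketch that computes the information entropy of a string from its letter frequencies (H = −Σ p·log₂ p, in bits per symbol). The example strings are mine, chosen just to show the contrast:

```python
import math
from collections import Counter

def shannon_entropy(text):
    """H = -sum(p * log2(p)) over symbol frequencies, in bits per symbol."""
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in Counter(text).values())

print(shannon_entropy("aaaaaaa"))   # 0.0: totally predictable, no information per symbol
print(shannon_entropy("entropy"))  # ~2.81: seven distinct letters, far less predictable
```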
And finally, because the universe has a temperature, a basic jiggliness, everything always wants to jiggle away into randomness. Randomness is the only true equilibrium…all patterns are an energy credit, and the moment there is a chance, the universe wants to take that credit away…entropy always increases on the whole. That is the second law of thermodynamics.
TL;DR: At any temperature above absolute zero, physical objects have a tendency to jiggle away. The energy required, per degree of temperature, to counter this specific tendency to jiggle away into randomness is the entropy.
For abstract patterns, an analogy: smashing away on the keyboard randomly creates a string of letters – say 7 letters, 'ptryneo'. Another random arrangement of the same 7 letters, 'reponty', requires the same amount of energy, i.e. 7 keystrokes' worth.
But to make them spell 'entropy', you need 7 keystrokes' worth of energy PLUS the additional energy equivalent of arranging them in a specific order so that they make sense…and contain information. This *additional* energy requirement is **entropy**.
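As a back-of-the-envelope for that "additional cost of arranging": 7 distinct letters have 7! possible orderings, and singling out the one ordering that spells 'entropy' takes about log₂(7!) bits of information. A quick sketch, assuming every ordering is equally likely:

```python
import math

arrangements = math.factorial(7)            # 5040 possible orderings of 7 distinct letters
bits_to_pick_one = math.log2(arrangements)  # ~12.3 bits to single out 'entropy' specifically

print(arrangements, round(bits_to_pick_one, 1))  # 5040 12.3
```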
Rather than saying entropy is the amount of 'disorder', it is more accurate to say "entropy" is the energetic cost of creating order from randomness.