Entropy, please oh god please.


I have read COUNTLESS explanations, countless examples, countless ELI5s. But I still have no fucking idea what entropy is. I’m **not** leaving this earth before I understand what it is. Thank you.

In: Physics

16 Answers

Anonymous 0 Comments

Entropy is such a complex topic, and its uses range from classical thermodynamics to statistical mechanics… but let me try to paint you a picture.

A car engine works by converting thermal energy into mechanical energy. The process works by combustion that violently pushes the pistons and hence makes them move. Thermal energy -> mechanical energy.

The first law says you cannot get more energy out than you put in. That means the maximum mechanical energy you can get is lower than or equal to the energy released by combustion.

Furthermore, the second law (which is about entropy) states that you cannot even break even. In reality you can NEVER get as much mechanical energy out as the energy you put in. Crazy, right? You’d expect the maximum efficiency of an engine to be 100%, but you can’t even approach that number: a gasoline engine, for instance, is theoretically only capable of being around 50% efficient. (You can mathematically prove this; the ceiling depends only on the hot and cold temperatures the engine runs between.) And that’s under the best conditions. Real-world engines still only achieve efficiencies of around 30%. Pretty crap, right?
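That ceiling is the Carnot efficiency, 1 - T_cold/T_hot, with both temperatures in kelvin. Here’s a minimal sketch of the calculation; the two temperatures below are assumed round numbers for illustration, not measurements from a real engine:

```python
# Carnot limit: the second-law ceiling on any heat engine's efficiency,
# eta_max = 1 - T_cold / T_hot, with temperatures in kelvin.

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of the heat input that can become work."""
    if not 0 < t_cold_k < t_hot_k:
        raise ValueError("need 0 < T_cold < T_hot (in kelvin)")
    return 1.0 - t_cold_k / t_hot_k

# Illustrative round numbers (assumed): hot combustion gases near 2000 K,
# heat rejected to the exhaust near 1000 K.
print(carnot_efficiency(2000.0, 1000.0))  # 0.5 -> the ~50% ceiling
```

Note that the only way to raise the ceiling is to run hotter or reject heat colder; no cleverness in the mechanism gets around it.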

Real-life engines are irreversible processes. That means you cannot reverse the process (duh). In an irreversible process, total entropy always increases; it only stays the same in an idealized reversible one. Thus for a car engine, realistically, entropy will always increase between combustion and mechanical work. In a way, entropy helps us tell whether a system is playing in “forward time” or “reverse time”.

And that’s just in thermodynamics. Statistical mechanics explains entropy with probability distributions and stuff. Anyway, entropy is really fucking complicated, so you shouldn’t be obsessed with learning everything about it unless you want to become a professor.

Anonymous 0 Comments

Consider a system that contains four particles. Each particle has two states; let’s call them high energy and low energy. We’ll represent them with 0 for low and 1 for high. What are the possible states?

* 0000
* 0001
* 0010
* 0011
* 0100
* 0101
* 0110
* 0111
* 1000
* 1001
* 1010
* 1011
* 1100
* 1101
* 1110
* 1111

So with four particles and two options each, we have 2^4 = 16 possible states. One thing you can immediately notice: the ordered states are rare. There’s just 0000 and 1111. 2 of 16.

So what is entropy? Literally that. It’s how many possible states there are, and how rare a given one is among them. The 0000 state is low entropy. One like 1010 is just as rare on its own, but it has many similar-looking ones (two highs, two lows) that together make a high-entropy group. Just replace the digits with atoms, and the two possible states with many more degrees of freedom. A sketch of that counting follows.
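Here’s that counting done explicitly in Python (my own illustration): enumerate the 16 microstates above and group them by how many particles are high-energy. ln(W) is Boltzmann’s entropy with the constant k_B dropped, which is enough to compare groups:

```python
from collections import Counter
from itertools import product
import math

# Enumerate all 2**4 = 16 microstates of four two-level particles
# (the same list as above), then group them by the number of 1s.
microstates = ["".join(bits) for bits in product("01", repeat=4)]
groups = Counter(state.count("1") for state in microstates)

for n_high in sorted(groups):
    w = groups[n_high]  # W = how many microstates look like this group
    # Boltzmann: S = k_B * ln(W); ln(W) alone is enough for comparison.
    print(f"{n_high} high-energy particles: W = {w}, ln(W) = {math.log(w):.2f}")

# W comes out 1, 4, 6, 4, 1: the "messy" middle groups have the most
# microstates (highest entropy); 0000 and 1111 are the lonely rare ones.
```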

If you have a few billion billion billion atoms, there are a lot of ways you could arrange them. A perfect crystal has a few options, but not many. An imperfect solid has a lot more options. A hot gas has way, way, way more options. So there are just more ways to arrange a bunch of atoms that macroscopically look like a hot gas than there are ways that macroscopically look like a perfect crystal. Just as drawing an ordered-looking number like 11111111 from a hat is rarer than drawing some messy-looking one like 10111010, because there are so many more messy-looking numbers. And this is why entropy increases. If something changes, it’s way, way more likely to go to a messy state than an ordered one. It’s not impossible, but highly, highly unlikely.

Now you may be saying: my example looks an awful lot like binary numbers. And it is. Entropy and information are really the same thing, a counting of states, and you can measure entropy in bits. But the information we use on a computer is more abstract and of a much, much, much smaller scale than the thermodynamic entropy of matter, so it’s better to use J/K, the thermodynamic unit. Now, you may think that looks like some sort of relationship between energy (joules) and temperature (kelvin). But let’s roll this back for a second, and consider that temperature is actually the relationship, not entropy. Rather than entropy = energy/temperature, consider that temperature = (change in energy)/(change in entropy). Temperature is the relationship between changes in energy and changes in entropy: how does adding more energy change the number of possible states? That is temperature.
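To make the “bits and J/K are the same quantity” point concrete, here’s a minimal sketch of the unit conversion (the gigabyte figure is just an assumed example): one bit of entropy equals k_B·ln 2 joules per kelvin.

```python
import math

# Information entropy (bits) and thermodynamic entropy (J/K) are the same
# quantity in different units: 1 bit = k_B * ln(2) joules per kelvin.
K_B = 1.380649e-23  # Boltzmann constant, in J/K

def bits_to_joules_per_kelvin(bits: float) -> float:
    return bits * K_B * math.log(2)

# Assumed example: one gigabyte (8e9 bits) of perfectly random data.
print(bits_to_joules_per_kelvin(8e9))  # ~7.7e-14 J/K
# Utterly tiny next to everyday thermodynamic entropies, which start
# around 1 J/K -- that's the "much smaller scale" point above.
```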

Anonymous 0 Comments

Basically: entropy is the number of possible configurations a system can exist in.

It’s not really as simple as that. There’s a lot of thermodynamics and quantum theory involved, and there’s actually a theory that, rather than average entropy increasing over time, time itself increases with the average entropy of the universe. But at the end of the day, the simplest way to put it is that things want to have as much freedom as possible.

Everything from position, orientation, and kinetic energy of the individual molecules to the distances between the molecules contributes to the entropy of the system.

Anonymous 0 Comments

It’s a difficult concept, and it really comes down to energy dispersal and statistical likelihood.

This is one of the easiest-to-understand videos on the topic I have found.

It won’t make you an expert, but it’ll give you a better idea of what entropy actually is.

Anonymous 0 Comments

Over time, shit changes to the temperature of the things next to it. This is true for literally everything, so the temperature of the Earth is slowly equalizing with the temperature of the space around it (which, very technically, is not a vacuum, since there are tiny particles in it).

“Entropy” is that idea. Energy slowly spreads out over time until it’s spread evenly through the system, and then nothing can change, since there are no differences left to drive anything.
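Here’s a toy simulation of that equalizing (every number is made up for illustration): two identical blocks trade heat in small parcels until their temperatures match, and the running total of dQ/T, the entropy change, only ever goes up.

```python
# Toy model: heat flows from a hot block to a cold one in small parcels.
# All numbers are assumed for illustration, not physical measurements.
hot, cold = 400.0, 200.0   # starting temperatures in kelvin
heat_capacity = 1.0        # J/K per block
entropy_change = 0.0       # running total of dQ/T over both blocks

while hot - cold > 0.01:
    dq = 0.01 * (hot - cold) * heat_capacity  # small parcel of heat, hot -> cold
    entropy_change += dq / cold - dq / hot    # cold gains more S than hot loses
    hot -= dq / heat_capacity
    cold += dq / heat_capacity

print(f"final temperatures: {hot:.2f} K and {cold:.2f} K")  # both ~300 K
print(f"total entropy change: {entropy_change:+.4f} J/K")   # always positive
```

Run it with any starting temperatures: they always meet in the middle, and the entropy total always comes out positive. That one-way bookkeeping is the second law.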

“Heat Death”, often mentioned along with entropy (so I’m guessing you want an explanation of that too), is when, over trillions of years, literally everything perfectly equalizes its heat, and then all of its other forms of energy degrade into heat as well, because heat bleeds off every time anything happens (usually through friction). Eventually, literally everything will be one evenly distributed, evenly “room temperature” blob, and because there’s no potential energy left, literally nothing more can happen.

*edit: entropy is also a word that can be used when not referring to the physics concept, and in that case it just means the general idea of stuff eventually falling apart and becoming useless.

Anonymous 0 Comments

I hope you get a decent ELI5 answer so I can finally understand it too. My loose grasp of entropy is that it’s the transition from order to chaos.