Entropy, please oh god please.

803 views

I have read COUNTLESS explanations, countless examples, countless ELI5s. But I still have no fucking idea what entropy is. I’m **not** leaving this earth before I understand what it is. Thank you.

In: Physics

16 Answers

Anonymous 0 Comments

I hope you get a decent ELI5 answer so I can understand it too. My loose grasp is that entropy is the transition from order to chaos.

Anonymous 0 Comments

Over time, shit changes to the temperature of whatever is next to it. This is true for literally everything, so the temperature of the earth is slowly equalizing with the temperature of the space around it (which, technically, is not a perfect vacuum, since there are tiny particles in it).

“Entropy” is that idea. Energy slowly spreads out over time until the system has equal energy everywhere, and then nothing can change, since it’s perfectly equal.

“Heat death”, often mentioned alongside entropy (so I’m guessing you want an explanation for that too), is when, over trillions of years, literally everything perfectly equalizes its heat, and then all of its other forms of energy degrade into heat as well, because heat is bled off every time there’s an action (usually in the form of friction). Eventually, literally everything will be one evenly distributed, uniformly “room temperature” blob, and because there’s no usable energy left, literally nothing more can happen.

*edit: “entropy” is also a word used outside the physics concept, and in that case it simply means the general idea of stuff eventually falling apart and becoming useless.

Anonymous 0 Comments

It’s a difficult concept, and it really comes down to energy dispersal and statistical likelihood.

This is one of the easiest-to-understand videos on the topic I have found.

It won’t make you an expert, but it’ll give a better idea of what entropy actually is.

Anonymous 0 Comments

Basically: entropy is the number of possible permutations a system can exist in.

It’s not really as simple as that: there’s a lot of thermodynamics and quantum theory involved, and there’s even a theory that, rather than average entropy increasing over time, time itself increases with the average entropy of the universe. But at the end of the day, the simplest way to put it is that things want to have as much freedom as possible.

Everything from the position, orientation, and kinetic energy of the individual molecules to the distance between the molecules contributes to the entropy of the system.

Anonymous 0 Comments

Consider a system that contains four particles. Each has two states; let’s call them high energy and low energy. We’ll represent these with 0 for low and 1 for high. What are the possible states?

* 0000
* 0001
* 0010
* 0011
* 0100
* 0101
* 0110
* 0111
* 1000
* 1001
* 1010
* 1011
* 1100
* 1101
* 1110
* 1111

So with four particles and two options each, we have 16 possible states. One thing you can immediately notice: the ordered states are rare. There’s just 0000 and 1111. 2 of 16.

So what is entropy? Literally that. It’s how many possible states there are, and how rare a given one is among them. The 0000 state is low entropy. One like 1010 is just as rare on its own, but it has many similar-looking neighbors that together are high entropy. Now just replace the digits with atoms, and the two possible states with many more degrees of freedom.
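If it helps to see the counting done explicitly, here’s a minimal Python sketch of this exact toy system (nothing here is special, it’s just the enumeration above, automated): it lists all 16 microstates and groups them by how many particles are high energy, which is the “macrostate”.

```python
from collections import Counter
from itertools import product

# Enumerate every microstate of four two-level particles (0 = low, 1 = high).
states = [''.join(bits) for bits in product('01', repeat=4)]

# Group microstates by macrostate: the total number of high-energy particles.
by_macrostate = Counter(s.count('1') for s in states)

print(len(states))           # 16 microstates in total
print(dict(by_macrostate))   # {0: 1, 1: 4, 2: 6, 3: 4, 4: 1}
```

The all-low and all-high macrostates get one microstate each, while the mixed “two of each” macrostate gets six, which is the whole point: the messy-looking option is simply the most common one.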

If you have a few billion billion billion atoms, there are a lot of ways you could arrange them. A perfect crystal has a few options, but not many. An imperfect solid has a lot more options. A hot gas has way, way, way more options. So there are just more ways to arrange a bunch of atoms that macroscopically look like a hot gas than there are ways that macroscopically look like a perfect crystal, just as an ordered number like 11111111 is rarer than a messy one like 10111010 if you were to draw the digits randomly from a hat. And this is why entropy increases: if something changes, it’s way, way more likely to go to a messy state than an ordered one. It’s not impossible, just highly, highly unlikely.

Now you may be saying: my example looks an awful lot like binary numbers. And it does, because entropy and information are really the same thing: the ordering of states. You can measure entropy in bits. But the information we use on a computer is more abstract and of a much, much, much smaller scale than the thermodynamic entropy of matter, so it’s better to use J/K, the thermodynamic unit. Now, you may think that looks like some sort of relationship between energy (joules) and temperature (kelvin). But let’s roll this back for a second and consider that temperature is actually the relationship, not entropy. Rather than entropy = energy/temperature, consider that temperature = energy/entropy. Temperature is the relationship between changes in entropy and changes in energy. How does adding more energy change the number of possible states? That is temperature.
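In symbols, this is just standard statistical mechanics (Ω is the microstate count, k_B is Boltzmann’s constant, and the derivative is taken at fixed volume):

```latex
% Boltzmann's formula: entropy is (the log of) the microstate count
S = k_B \ln \Omega

% Temperature, defined as the response of entropy to added energy
\frac{1}{T} = \left( \frac{\partial S}{\partial E} \right)_{V}
```

Read that way, a cold object is one where a little extra energy opens up a lot of new states, and a hot one is one where the same energy barely changes the count.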

Anonymous 0 Comments

Entropy is such a complex topic, and its uses range from classical thermodynamics to statistical mechanics… but let me try to paint you a picture.

A car engine works by converting thermal energy into mechanical energy. The process works with some kind of combustion that violently pushes the pistons and hence makes them move. Thermal energy -> mechanical energy.

The first law says you cannot get more energy out than you put in. That means the maximum mechanical energy you can get is lower than or equal to the energy put into combustion.

Furthermore, the second law (which is about entropy) states that you cannot even break even. That means that in reality you can NEVER get as much mechanical energy out as the thermal energy you put in. Crazy, right? You’d expect the maximum efficiency of an engine to be 100%, but you can’t even approach that number, because gasoline engines, for instance, are theoretically only capable of being about 50% efficient (you can mathematically prove this). And that’s under the best conditions. We still only achieve efficiencies of around 30% in the real world. Pretty crap, right?
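The “you can mathematically prove this” part is the Carnot bound: no heat engine working between a hot side and a cold side can beat 1 − T_cold/T_hot. A minimal Python sketch, with illustrative temperatures rather than measured engine values:

```python
# Carnot bound: the second-law ceiling on any heat engine's efficiency.
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Max fraction of input heat convertible to work: 1 - T_cold / T_hot.

    Temperatures are in kelvin; t_hot is the combustion side, t_cold the
    surroundings the waste heat gets dumped into.
    """
    return 1.0 - t_cold / t_hot

# Illustrative numbers: a 600 K hot side exhausting into 300 K surroundings.
print(carnot_efficiency(600.0, 300.0))  # 0.5 -> at best, half the heat is wasted
```

Real engine cycles (Otto, Diesel) have their own, lower ceilings, but by Carnot’s theorem they all sit under this one.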

Real-life engines are irreversible processes. That means you cannot reverse the process (duh). Total entropy never decreases: it stays the same only in an idealized reversible process, and in an irreversible one it always increases. Thus for a car engine, realistically, entropy will always increase between combustion and mechanical work. In a way, entropy helps us understand whether systems are playing in “forward time” or “reverse time”.

And that’s just thermodynamics. Statistical mechanics explains entropy with probability distributions and stuff. Anyway, entropy is really fucking complicated, so don’t obsess over learning everything about it unless you want to become a professor.

Anonymous 0 Comments

I’ll give you the answer that helped me as a physics undergraduate.

Entropy is a measure of the number of microstates corresponding to a particular macrostate. If you’ve not studied physics, that probably means nothing to you, so here’s an example.

Imagine a cup of coffee. It’s filled with coffee and milk mixed together. Let’s say that 80% of the particles in the cup are coffee particles and 20% are milk.

There are a certain number of ways you can arrange those particles of milk and coffee in the cup. Each specific arrangement is a microstate. If you could somehow break it down and see exactly where each particle is in the mixture, you’d know its microstate.

What do you actually see? You see a cup where the coffee and milk are well mixed together. Even if you don’t stir it, they’ll eventually look like that. That’s the macrostate: the overall state of the system.

An alternative macrostate would be to have the milk all in one layer on top of the coffee, not mixed together at all. There is nothing physically preventing this from happening. But there are only a few ways to arrange the coffee and milk particles that would produce this result. If you could take that cup full of milk and coffee particles and randomise their positions, on almost every randomisation you’d end up with a cup where the particles are evenly mixed, because there are far more possibilities that produce that result.

That’s what people mean when talking about entropy as disorder. The more ‘disordered’ state has all the particles mixed up, while having them cleanly separated is ‘ordered’. The disordered state has a higher entropy, because there are more ways you can arrange the particles to get that result.
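To put toy numbers on that (the slot count and milk fraction here are made up purely for illustration), here’s a small Python sketch that counts, for each way of splitting the milk between the top and bottom halves of the cup, how many particle arrangements produce that split:

```python
from math import comb

# Toy cup: 20 slots, 4 of them milk, the rest coffee (illustrative numbers).
slots, milk = 20, 4
top = slots // 2  # number of slots in the top half of the cup

# Macrostate = how many milk particles sit in the top half.
# Microstate count = number of distinct arrangements producing that macrostate.
for milk_on_top in range(milk + 1):
    ways = comb(top, milk_on_top) * comb(slots - top, milk - milk_on_top)
    print(f"{milk_on_top} milk particles in the top half: {ways:4d} arrangements")
```

The “milk all in one layer” macrostates (0 or 4 on top) get 210 arrangements each, while the evenly mixed one (2 on top) gets 2025. Randomise the cup and you will almost always land in the mixed state, exactly as described above.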

That’s why, if you pour some milk in a cup and leave it, it’ll eventually mix together on its own. Stirring just speeds it up.

Anonymous 0 Comments

(For the record: I don’t understand entropy, and I’ve kind of made my peace with it. I 100% don’t get how information has entropy.)

Part of the reason entropy is hard to teach is that it started with people trying to figure out steam engines (hence all the heat-related explanations), tied to a bunch of woo-woo but also grounded in massive, giant entropy tables for steam. Add in the ability of computers to do massive statistics, and the love physicists have for tying seemingly unrelated branches of knowledge together, and you get kind of a mess.

Not only is it hard, but it’s hard with a long history.

Anonymous 0 Comments

Did you ever get an answer that helped you understand it? I am not a physicist, but one of my close friends is and they explained very clearly what entropy ‘is’. If you are still confused, let me know and I’ll type out my understanding.

Anonymous 0 Comments

Entropy refers to disorder (and the spreading out of energy): the higher the entropy, the greater the disorder. A big thing to remember is that our everyday definition of chaos and order is different from the thermodynamic one. To us, order is everything being even, but in thermodynamics it’s everything being concentrated at one point. Everything moves towards disorder, and it all wants to end up at the same level of disorder.

Think about a glass of water that’s all at the same room temperature: it’s in a state of high entropy; it is disordered. But if you drop an ice cube in the cup, all the temperature difference is organized and consolidated in one spot, so the system has low entropy and is orderly. Since all things move towards disorder, eventually the ice will melt, and given enough time all the water will be the same temperature. Eventually that temperature would seep into the air, making the water room temperature again: high entropy.
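You can actually check the “equalizing raises entropy” claim with the standard formula for heating or cooling water, ΔS = m·c·ln(T_final/T_initial). A minimal Python sketch, with made-up masses and temperatures:

```python
from math import log

# Mix equal masses of cold and hot water and verify that total entropy rises.
c = 4186.0                    # J/(kg*K), specific heat of liquid water
m = 1.0                       # kg of each parcel (illustrative)
t_cold, t_hot = 280.0, 320.0  # starting temperatures in kelvin (illustrative)
t_final = (t_cold + t_hot) / 2  # equal masses -> they meet in the middle

# Entropy change of each parcel: dS = m * c * ln(T_final / T_initial)
ds_cold = m * c * log(t_final / t_cold)  # positive: the cold water warmed up
ds_hot = m * c * log(t_final / t_hot)    # negative: the hot water cooled down
print(ds_cold + ds_hot)  # ~ +18.6 J/K: the gain outweighs the loss
```

The cold water gains more entropy than the hot water loses, so the evened-out state wins.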

Basically, the universe likes stuff to be the same.