What is Entropy?

What is entropy? Also, can anyone explain how it relates to matter, gravity, time, other physics terms?

In: Physics

15 Answers

Anonymous 0 Comments

Entropy is a measure of how evenly energy is spread out or distributed through a system. This is what they mean by ‘disorder’, which is a confusing term. Think of it this way: if you have nice neat concentrations of energy, such as a warm spot on the floor in an otherwise cold room, then the room is “low entropy”. But if it’s all one temperature, that’s “maximum entropy”. Well, kinda…
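
If you like numbers, here’s a toy Python sketch of Boltzmann’s formula S = k·ln(W), where W counts the number of ways (“microstates”) the energy can be arranged. The 100-spot “room”, the 50 energy packets, and the 5-spot “warm patch” are all made-up numbers just for illustration:

```python
# Toy illustration (not from the answer above): concentrated energy has far
# fewer possible arrangements than spread-out energy, so its entropy is lower.
from math import comb, log

def microstates(quanta, sites):
    # Ways to place `quanta` indistinguishable energy packets on `sites`
    # distinguishable spots (stars-and-bars counting).
    return comb(quanta + sites - 1, sites - 1)

QUANTA = 50          # packets of energy in the room (made-up number)
ROOM_SITES = 100     # spots in the room that can hold energy (made-up number)

# "Warm spot": all the energy is squeezed into just 5 of the 100 spots.
W_warm_spot = microstates(QUANTA, 5)
# "Even temperature": the energy is free to sit on any of the 100 spots.
W_spread_out = microstates(QUANTA, ROOM_SITES)

# Entropy in units where Boltzmann's constant k = 1.
print("S warm spot  :", log(W_warm_spot))    # smaller
print("S spread out :", log(W_spread_out))   # larger -> higher entropy
```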

However, what the Second Law of Thermodynamics says is that over time, in a closed system, entropy can only increase until it reaches its maximum, which means the energy ends up evenly distributed through the whole system. It’s no longer in nice neat stacks; it’s broken up into countless little bits and spread everywhere. A closed system (strictly speaking, an ‘isolated’ system) is one where no energy can get in or out.
The Universe as a whole is the only known closed system; everything else is an ‘open system’ – for example, the Earth receives heat from the Sun, the Sun radiates heat out into the solar system, and so on. It’s believed that given enough time the universe will experience ‘heat death’, where the energy will be evenly distributed and entropy will be at its maximum.
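
Here’s a minimal toy simulation of that Second Law idea (my own sketch, with invented numbers): energy packets hop around at random inside a sealed box, and the entropy of their distribution drifts up toward its maximum and stays there.

```python
# Toy model: packets of energy hopping randomly between cells in a closed box.
import random
from math import log

random.seed(0)
CELLS, QUANTA, STEPS = 20, 200, 5000   # made-up sizes for illustration

# Start "low entropy": every packet crammed into cell 0 (the warm spot).
energy = [QUANTA] + [0] * (CELLS - 1)

def entropy(counts):
    # Shannon-style entropy of the energy distribution (in nats).
    total = sum(counts)
    return -sum((c / total) * log(c / total) for c in counts if c > 0)

for step in range(STEPS + 1):
    if step % 1000 == 0:
        print(f"step {step:5d}  entropy = {entropy(energy):.3f}  (max = {log(CELLS):.3f})")
    # Pick one random packet and let it hop to a random cell.
    src = random.choices(range(CELLS), weights=energy)[0]
    dst = random.randrange(CELLS)
    energy[src] -= 1
    energy[dst] += 1
```

Run it and you’ll see the entropy start at 0 (everything in one spot) and creep up toward the maximum, log(20), as the energy spreads out.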

I think Feynman described it with a coffee cup: if you shatter it into lots of little pieces, you can’t use the coffee cup anymore. But when it’s neatly in the shape of a coffee cup, you can use it. That’s sort of like energy and entropy, I guess. An evenly warm (or evenly cold) room is the broken cup.

Now, where things get confusing is that in the mid-20th century a Bell Labs engineer named Claude Shannon worked out an amazing theory and equation for how much information you could send over a channel without losing any of it. He showed it to one of the smartest men in the world, John von Neumann, who told him the equation was very similar to the one used to calculate entropy – and since most people didn’t really understand entropy anyway, he suggested Shannon call his quantity ‘entropy’ too, because it would save him a lot of questions. This turns out to be really useful in artificial intelligence, in cryptography, in radio signals… there are so many modern applications of Shannon’s “Information Theory”, which basically says you have a transmitter, a channel with a certain amount of ‘noise’ on it, and a receiver. Shannon developed equations that tell you the maximum amount of information you can send over that channel before it gets fouled up by the noise.
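
The most famous of those limits is the Shannon–Hartley capacity formula, C = B·log2(1 + S/N). Here’s a tiny sketch of it in Python – the 3 kHz bandwidth and the signal-to-noise ratio of 1000 are just illustrative numbers, not anything from the answer above:

```python
# Shannon-Hartley channel capacity: the maximum error-free bit rate you can
# push through a noisy channel.
from math import log2

def channel_capacity(bandwidth_hz, signal_power, noise_power):
    # C = B * log2(1 + S/N), in bits per second.
    return bandwidth_hz * log2(1 + signal_power / noise_power)

# e.g. a hypothetical 3 kHz phone line whose signal is 1000x stronger than the noise:
print(channel_capacity(3000, 1000, 1))   # ~29,900 bits per second
```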

Shannon entropy has this counter-intuitive property where the less probable an event is, the more information it carries. For example, if you have a worker who for the last 400 work days has come to work on time at 8am, and it’s 12pm and he’s still not at work – something strange has happened. That’s the idea behind Shannon entropy: his absence, the breaking of the pattern, tells you more than if he had arrived as usual.
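
You can put a number on that surprise using Shannon’s self-information, I(x) = −log2(p(x)). The probabilities below are made-up stand-ins for the worker example:

```python
# Rare events carry more bits of information than expected ones.
from math import log2

def surprisal_bits(p):
    # Information content (in bits) of an event with probability p.
    return -log2(p)

p_on_time = 400 / 401   # he has shown up on time 400 days out of the last 401
p_absent  = 1 / 401     # the rare "still not here at noon" event

print(f"on time as usual  : {surprisal_bits(p_on_time):.4f} bits")   # ~0.004 bits
print(f"unexpected absence: {surprisal_bits(p_absent):.2f} bits")    # ~8.65 bits
```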

Now this is where the ‘disorder’ thing becomes less confusing, because if you have a lot of random events with no pattern, you approach the maximum entropy for a channel, as opposed to nice, neat, consistent patterns – like turning up for work every day – which are low entropy. So you can kind of apply this backwards to thermodynamics: entropy is about the pattern (or lack thereof) in something.
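
To tie it together, here’s one last sketch comparing a perfectly consistent record (the worker who shows up every single day) against pure coin flips – the strings are invented for illustration:

```python
# Empirical Shannon entropy of a predictable record vs. a random one.
import random
from collections import Counter
from math import log2

def empirical_entropy(s):
    # Shannon entropy (bits per symbol) of the symbol frequencies in s.
    counts = Counter(s)
    total = len(s)
    return sum(-(c / total) * log2(c / total) for c in counts.values())

random.seed(1)
always_on_time = "1" * 800                                        # same event every day: fully predictable
coin_flips = "".join(random.choice("01") for _ in range(800))     # no pattern at all

print(f"consistent pattern: {empirical_entropy(always_on_time):.2f} bits/symbol")  # 0.00
print(f"random noise      : {empirical_entropy(coin_flips):.2f} bits/symbol")      # ~1.00
```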
