ELI5-What is entropy?

33 Answers

Anonymous

The statistical definition of entropy is that it is proportional to the number of microstates that can make up a macrostate.

Now, what is a macrostate? A hamburger. There are different kinds of hamburgers, but let's say you consider a cheeseburger. That's our macrostate. What is a microstate? The particular way you stack the parts in the sandwich. You can put the cheese on the lettuce or the lettuce on the cheese. The tomato can go on top, or, if you are feeling very energetic, the meat patty can go on top, but it is still a cheeseburger in the end. So how many ways can you make a cheeseburger? A lot of ways. So the entropy of a cheeseburger is high. What if I restricted you to only the buns, one meat patty, and one slice of cheese? Now the number of ways you can assemble the cheeseburger is quite limited, so the entropy of this restricted cheeseburger is low.

So entropy tells us how many ways a system can be built from its parts.
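A quick sketch of the burger analogy in code (my own toy illustration, not part of the original answer): treat each distinct ordering of the fillings between the buns as one microstate, and take the entropy as the log of the microstate count, Boltzmann-style with the constant set to 1. The ingredient lists here are made up for the example.

```python
import math
from itertools import permutations

def burger_entropy(fillings):
    # Each distinct ordering of the fillings is one microstate of the
    # same macrostate ("a cheeseburger"). set() collapses duplicates
    # in case the same ingredient appears twice.
    microstates = len(set(permutations(fillings)))
    return microstates, math.log(microstates)

# Full cheeseburger: many parts -> many possible orderings.
full = ["patty", "cheese", "lettuce", "tomato", "onion"]
# Restricted cheeseburger: only patty and cheese -> few orderings.
restricted = ["patty", "cheese"]

n_full, s_full = burger_entropy(full)           # 120 microstates
n_restricted, s_restricted = burger_entropy(restricted)  # 2 microstates

print(n_full, round(s_full, 2))
print(n_restricted, round(s_restricted, 2))
```

With five distinct fillings there are 5! = 120 orderings, so the "entropy" is ln 120 ≈ 4.79; the restricted burger has only 2 orderings, giving ln 2 ≈ 0.69. More allowed microstates means higher entropy, which is the whole point of the analogy.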
