ELI5: What is entropy?

Anonymous 0 Comments

Roughly speaking, entropy is the amount of information required to describe a system. For example, take a system of 10 coins, numbered 1 to 10. If the coins are showing all heads, you can simply say `10H` to describe the system. That's 3 characters. Change the 5th coin to show tails. Now your description of the system is `4H 1T 5H`, requiring 6 characters (not counting spaces). If the pattern of heads and tails is completely random, the only way to describe it is to write it out in full, requiring 10 characters. The last case has the most entropy, the first case the least: the more disordered the system, the longer its shortest description.
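Here's a minimal Python sketch of that idea. The run-length helper and the three example coin strings are just my own illustration of "shortest description", not any standard library feature:

```python
import itertools

def run_length(coins: str) -> str:
    """Run-length encode a coin string, e.g. 'HHHHTHHHHH' -> '4H1T5H'."""
    return "".join(f"{len(list(group))}{face}"
                   for face, group in itertools.groupby(coins))

def shortest_description(coins: str) -> str:
    """Use the run-length form if it is shorter, otherwise write it out in full."""
    encoded = run_length(coins)
    return encoded if len(encoded) < len(coins) else coins

# All heads, one tail in position 5, and a random-looking arrangement.
for coins in ["HHHHHHHHHH", "HHHHTHHHHH", "HTHHTTHTHH"]:
    description = shortest_description(coins)
    print(f"{coins} -> {description} ({len(description)} characters)")
```

Running it prints 3, 6, and 10 characters for the three cases: the random arrangement doesn't compress at all, which is exactly the point about it having the most entropy.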
