What does the concept of entropy mean?

Entropy is a statistical property that relates the microstates of a particular system (the exact state of every element in the system) to the system's macrostate (the overall, high-level state of the system).

A good example would be to consider a set of 5 coins. The macrostate of 5 heads has only one microstate:
H,H,H,H,H

The macrostate of 4 heads has 5 possible microstates:

T,H,H,H,H

H,T,H,H,H

H,H,T,H,H

H,H,H,T,H

H,H,H,H,T

The macrostate of 3 heads has 10 possible microstates:

T,T,H,H,H

T,H,T,H,H

T,H,H,T,H

T,H,H,H,T

H,T,T,H,H

H,T,H,T,H

H,T,H,H,T

H,H,T,T,H

H,H,T,H,T

H,H,H,T,T
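
(If you'd rather not list those by hand, here's a small Python sketch that double-checks the counts; the variable names are just mine for illustration. It enumerates all 2^5 = 32 microstates and groups them by macrostate.)

```python
from itertools import product
from collections import defaultdict

# Every microstate of 5 coins: one of the 2**5 = 32 possible H/T sequences.
microstates = list(product("HT", repeat=5))

# Group microstates by their macrostate (the number of heads).
by_macrostate = defaultdict(list)
for state in microstates:
    by_macrostate[state.count("H")].append(",".join(state))

for heads in sorted(by_macrostate, reverse=True):
    print(f"{heads} heads: {len(by_macrostate[heads])} microstates")
# Prints 1, 5, 10, 10, 5, and 1 microstates for 5, 4, 3, 2, 1, and 0 heads.
```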

The multiplicity of a particular macrostate is the number of microstates corresponding to that macrostate. In the above examples, 5 heads has a multiplicity of 1, 4 heads has a multiplicity of 5, 3 heads has a multiplicity of 10, 2 heads also has a multiplicity of 10, 1 head has a multiplicity of 5, and 0 heads has a multiplicity of 1. The entropy of a system is (Boltzmann's constant multiplied by) the natural log of the multiplicity.
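
(Put another way: for n coins, the multiplicity of the "k heads" macrostate is the binomial coefficient "n choose k", and the Boltzmann entropy is S = k_B ln(multiplicity). A minimal sketch of that, using the exact SI value of Boltzmann's constant:)

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K (exact SI value)

def coin_entropy(n_coins, n_heads):
    # Multiplicity: how many microstates have exactly n_heads heads.
    multiplicity = math.comb(n_coins, n_heads)
    # Boltzmann entropy: S = k_B * ln(multiplicity).
    return k_B * math.log(multiplicity)

for heads in range(5, -1, -1):
    print(f"{heads} heads: multiplicity {math.comb(5, heads)}, "
          f"entropy {coin_entropy(5, heads):.2e} J/K")
# All heads and all tails have multiplicity 1 and entropy 0;
# the evenly split macrostates have the largest multiplicity and entropy.
```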

When you flip a number of coins, you can't predict the particular sequence you will get, but you can be reasonably confident that you'll end up with a roughly even split of heads and tails (a high-entropy state) rather than all heads or all tails (a low-entropy state).
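
(To put rough numbers on that, here's a quick sketch of the probabilities; the 100-coin case is my addition, just to show how the effect gets stronger with more coins.)

```python
import math

def prob_heads(n, k):
    # Probability of exactly k heads in n fair coin flips.
    return math.comb(n, k) / 2**n

# 5 coins: a roughly even split (2 or 3 heads) vs. all heads.
print(prob_heads(5, 2) + prob_heads(5, 3))  # 0.625
print(prob_heads(5, 5))                     # 0.03125

# 100 coins: between 40 and 60 heads vs. all heads.
print(sum(prob_heads(100, k) for k in range(40, 61)))  # roughly 0.96
print(prob_heads(100, 100))                            # roughly 8e-31
```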
