Shannon’s information theory


It’s the field of mathematics that quantifies information.

Take a coin flip where you want to know which side came up. To learn that, you need exactly 1 yes/no question, so we say the quantity of information is 1. If you want the results for 2 coins, you need 2 units of information.
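To make this concrete, here's a small Python sketch (not from the original answer): counting how many yes/no questions are needed to pin down one of n equally likely outcomes.

```python
import math

# Yes/no questions needed to identify one of n equally likely outcomes:
# each question splits the possibilities in half, so we need ceil(log2(n)).
def questions_needed(n_outcomes: int) -> int:
    return math.ceil(math.log2(n_outcomes))

print(questions_needed(2))  # 1 coin: 2 outcomes -> 1 question
print(questions_needed(4))  # 2 coins: 4 outcomes -> 2 questions
```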

If you toss a die, you need 3 yes/no questions to know which side it landed on. So the quantity of information required to describe a die roll would be 3.

Information theory goes further by allowing non-integer quantities of information. In the case of the die, for example, you don’t really need 3 “full” questions: 3 questions give you enough information for an 8-sided die, not just a 6-sided one. So the quantity of information for a normal die should be lower than 3, but still bigger than 2.
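You can check the “between 2 and 3” claim numerically; the exact value is log2(6), which this short snippet (my illustration, not part of the original answer) computes:

```python
import math

# 3 questions distinguish 2**3 = 8 outcomes, more than a die's 6.
print(2 ** 3)        # 8
# The exact information content of a die roll is log2(6),
# which sits strictly between 2 and 3.
print(math.log2(6))  # roughly 2.585
```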

You can check that, if you want to know the results of 3 dice (the full outcome, not just the sum), you won’t need 9 questions. If you choose your questions carefully enough, you can get it in 8 questions (left as an exercise^^). So the information stored in one die is closer to 8/3.
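Here's a quick check of that claim in Python (my own verification, assuming the dice are fair and independent): 3 dice give 216 outcomes, and 8 questions are enough because 2^8 = 256 ≥ 216, while 7 questions (128) are not.

```python
import math

outcomes = 6 ** 3  # 3 dice, full result: 216 equally likely outcomes
print(outcomes)    # 216

# 8 questions distinguish 2**8 = 256 outcomes, enough for 216;
# 7 questions only cover 2**7 = 128, which is not enough.
print(math.ceil(math.log2(outcomes)))  # 8

# So each die costs about 8/3 ~ 2.67 questions, close to log2(6) ~ 2.585.
```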

While this might look like just fun and games at first, this theory actually has very serious applications, in particular in computer text encoding, where we look for the most efficient way to encode and transmit a message with 0s and 1s.
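One classic technique in that spirit is Huffman coding, which gives frequent symbols short 0/1 codes and rare symbols longer ones. Here's a minimal sketch (my illustration, not something from the original answer; assumes at least two distinct symbols in the input):

```python
import heapq
from collections import Counter

# Minimal Huffman-coding sketch: repeatedly merge the two least
# frequent groups, prepending a 0 to one side's codes and a 1 to
# the other's, until one tree of codes remains.
def huffman_codes(text: str) -> dict[str, str]:
    # Heap entries: (frequency, unique tiebreaker, {symbol: code so far})
    heap = [(freq, i, {sym: ""})
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
# 'a' is the most frequent symbol, so it gets the shortest code.
print(codes)
```

The resulting codes are prefix-free: no code is the start of another, so a stream of 0s and 1s can be decoded unambiguously.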
