What is the context of the creation of ASCII and UTF-8?


Each time I dive into the subject I get lost, like WTF!?

Were they used to convert binary voltage states (bytes) into display signals?

Were they invented as a guide for software to translate 1s and 0s (which are NOT bytes)?

Is it both, for convenience?

What is it???


5 Answers

Anonymous

Bytes are not binary voltage states. They are sets of 8 bits that can be interpreted in any way, and they can be stored and transmitted bit by bit. One of the first text encodings was [Baudot code](https://en.wikipedia.org/wiki/Baudot_code), invented in the 1870s. It used sets of 5 bits, most likely because the 26 letters of the Latin alphabet need 5 bits (2^5 = 32 possible codes) to encode. Murray code later added Carriage Return and Line Feed codes, demonstrating that you can assign arbitrary meaning to codes.
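
To make that concrete, here is a tiny Python sketch. The 5-bit table below is made up for illustration (it is not the historical Baudot alphabet); the point is that bit patterns mean nothing by themselves and only become letters because both ends agree on a lookup table.

```python
# A toy 5-bit letter code in the spirit of Baudot (NOT the historical table):
# each letter is just an agreed-upon 5-bit pattern, nothing more.
TOY_CODE = {chr(ord('A') + i): i + 1 for i in range(26)}  # A=00001 ... Z=11010

def encode_5bit(text):
    """Turn a string of capital letters into space-separated 5-bit groups."""
    return " ".join(format(TOY_CODE[c], "05b") for c in text)

print(encode_5bit("HELLO"))     # 01000 00101 01100 01100 01111
print(format(ord("H"), "07b"))  # the same letter as its 7-bit ASCII code: 1001000
```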

You can transmit and store Baudot- or Murray-encoded text as a sequence of sets of 5 bits. Similarly, you can transmit and store ASCII-encoded text as a sequence of sets of 7 bits, but that would be inconvenient from a software point of view. If CPUs had instructions like “retrieve 7 bits from RAM” and “retrieve 7 bits from RAM at position N”, you might well see ASCII stored as a sequence of sets of 7 bits.
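
A rough Python sketch of the two storage options (assuming plain ASCII text and a most-significant-bit-first packing order, both choices made here just for illustration): one byte per character can be indexed directly, while tightly packed 7-bit codes need extra bit arithmetic to reach the Nth character.

```python
text = "ASCII"

# 1) The convenient way: one 8-bit byte per character, directly addressable.
as_bytes = text.encode("ascii")   # b'ASCII', 5 bytes
print(as_bytes[2])                # third character fetched with a single index: 67 ('C')

# 2) The packed way: 7 bits per character, crammed into one continuous bit string.
bitstream = "".join(format(b, "07b") for b in as_bytes)
print(len(bitstream))             # 35 bits instead of 40
# To get the third character you must slice at a bit offset of 2 * 7 = 14 ...
third = int(bitstream[14:21], 2)
print(chr(third))                 # 'C' -- but only after extra bit arithmetic
```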

The reason CPUs don’t have a “retrieve 7 bits from RAM at position N” instruction is that multiplying N by 7 requires a full-blown multiplier, which takes many transistors to implement and one or two cycles to perform, whereas multiplying N by 8 requires only shifting the bits and appending three zeros: binary 1001 times 8 is 1001000. That takes virtually no transistors and no extra cycles.
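
A small sketch of that addressing arithmetic, written in Python only to make the bit patterns visible:

```python
N = 0b1001          # element index 9

# Byte-aligned: the bit offset of element N is N * 8, which is just N shifted left by 3.
print(format(N,      "b"))   # 1001
print(format(N << 3, "b"))   # 1001000  -- same bits with three zeros appended
assert N * 8 == N << 3

# 7-bit elements: the offset N * 7 has no such trivial form;
# hardware would need a real multiplier (or extra shift-and-subtract steps: N*8 - N).
print(format(N * 7, "b"))    # 111111
assert N * 7 == (N << 3) - N
```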
