How did ASCII assign letters specific numbers? How did that system come to be? How did early computers adapt to it?


For example: how was the letter A given the binary code of “01000001”? (I really don’t know anything about this but I’m interested)



Anonymous

ASCII didn’t come out of nowhere. Before ASCII there was Baudot, and before Baudot there was Morse. In Morse code, a sequence of long and short signals (dots and dashes, carried as sound or silence, voltage or no voltage, or whatever the medium allows) let a human operator with an electromechanical switch and a sound-making device send and receive numbers and letters digitally, i.e. over a wire or radio signal that is either “on” or “off”. The rate of Morse is limited by how fast a human operator can accurately encode and decode.
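To make that idea concrete, here is a tiny sketch of my own (not part of the original answer): each letter maps to a fixed pattern of short and long signals, and a word is just those patterns sent one after another.

```python
# A few well-known Morse mappings: each letter is a fixed pattern of
# short (.) and long (-) signals, separated by gaps of silence.
MORSE = {"S": "...", "O": "---", "A": ".-", "I": ".."}

def to_morse(text):
    """Encode a word letter by letter, with spaces marking the gaps between letters."""
    return " ".join(MORSE[ch] for ch in text.upper())

print(to_morse("sos"))  # ... --- ...
```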

When typewriters became common, the idea was that instead of a human Morse key operator, a machine could automatically generate a digital encoding of each key press on a typewriter and send it to a remote typewriter, so that when I press a key here, the typewriter there types the letter. Baudot was designed with typewriters in mind, so it covered the keys on a typewriter plus the necessary control signals (like carriage return and line feed). Different languages had standardised on different typewriters to reflect things like accented letters and the different frequency of letters in each language (e.g. the French AZERTY layout rather than the English QWERTY layout). Baudot is a 5-bit digital encoding, and in addition to immediate typing and printing, the codes could be punched as holes in a paper tape and read back electromechanically, allowing storage and sending/receiving at a faster rate (the data connection could support a faster speed than a human typist, so having multiple typists type to paper tape and then sending the tapes at the maximum line rate gave a higher throughput).
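As a quick back-of-the-envelope check (my addition, not from the answer above): a 5-bit code like Baudot can only distinguish 32 patterns, which is why it relies on shift codes to switch between a letters page and a figures page.

```python
# How many distinct codes does an n-bit encoding allow?
for bits in (5, 7):
    print(f"{bits} bits -> {2 ** bits} possible codes")
# 5 bits -> 32 codes: enough for A-Z plus a few controls, but only by using
#          "shift" codes to switch between letters and figures.
# 7 bits -> 128 codes: what ASCII later settled on (more on that below).
```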

When computers reached the level of development where text input/output was feasible, the obvious idea was to use a teletype as the interface. Rather than typing here and printing there, a person could type here and the digitally encoded text would go to the computer. The computer’s output would then be sent back to the teletype as a digital signal and printed in the same way as on a conventional teletype. This use is why on Unix (and Linux) systems the serial port is referred to as a TTY (for teletype).

There were a couple of problems with using Baudot for computers, though. First, different languages had different encoding standards. Second, to use a computer you want to be able to send control signals that make the computer do things beyond simply sending and receiving text. Also, there are symbols that are useful for computers but not really useful in plain text.

The idea of ASCII was to take the same concept of a digital signal that could be generated by a keyboard, sent to a computer, and then sent back out to a typewriter-style printer, but design the encoding with computer use in mind. Baudot was only 5 bits, which didn’t allow enough different symbols for everything computer users wanted, so ASCII was set at 7 bits. That allowed more characters (for example, Baudot generally doesn’t have separate upper and lower case letters, while ASCII does) as well as the extra control signals. In the same way that upper case letters are produced by holding down the shift key while typing the key for a lower case letter, the control signals are produced by holding down a “control” key while typing the relevant letter. That’s the origin of the CTRL key on a modern computer keyboard.
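To tie this back to the original question about “01000001”: in ASCII, capital A was placed at code 65, which is 1000001 in 7 bits; the matching lowercase letters sit 32 codes higher (one extra bit set), and the control characters are the same patterns with the top bits cleared, which is essentially what the Ctrl key did electrically on old terminals. A small sketch of my own to show the bit patterns:

```python
# A small illustration of how ASCII's 7-bit layout relates 'A', 'a' and Ctrl-A.

def bits7(n):
    """Format a code point as a 7-bit binary string."""
    return format(n, "07b")

A = ord("A")            # 65, i.e. 1000001 in binary
a = A | 0b0100000       # setting bit 5 gives lowercase 'a' (97)
ctrl_a = A & 0b0011111  # clearing the top two bits gives Ctrl-A (SOH, code 1)

print("A      ->", bits7(A))        # 1000001
print("a      ->", bits7(a))        # 1100001
print("Ctrl-A ->", bits7(ctrl_a))   # 0000001
```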
