Computers don’t know letters; they only know numbers. The original computers didn’t have display screens, so all input/output went through punchcards or an electronic typewriter/printer. Even modern computer displays don’t ‘know’ letters; they just know that the number 65 should be drawn with the font glyph for ‘A’.
The typewriter needed a way to know which letter to print. So sending the number 10 would (for example) print an A.
But what if another computer used 16, 17, 18, etc. for A, B, C, etc.? ASCII was designed as a standard to settle which number maps to which character. It also defined special control characters for typewriter operations like carriage return, backspace, and tab.
There was a competing standard from IBM, EBCDIC, that defined a different mapping from numbers to characters.
https://en.wikipedia.org/wiki/EBCDIC#/media/File:Blue-punch-card-front-horiz_top-char-contrast-stretched.png
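For the curious, here’s a short Python sketch showing the idea: the same letters map to different numbers depending on the encoding. It uses `cp500`, Python’s built-in codec for one common EBCDIC variant, so the exact EBCDIC numbers shown are for that variant.

```python
# Same letters, different numbers under ASCII vs. EBCDIC (cp500 variant).
for ch in "ABC":
    ascii_code = ord(ch)                  # ASCII code point (A=65, B=66, C=67)
    ebcdic_code = ch.encode("cp500")[0]   # EBCDIC code (A=193, B=194, C=195)
    print(ch, ascii_code, ebcdic_code)

# ASCII also reserves low numbers for control characters:
# 9 is tab, 13 is carriage return.
print(chr(9) == "\t", chr(13) == "\r")    # prints: True True
```

Two machines only agree on what a byte "says" if they agree on the mapping, which is exactly why a shared standard mattered.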