How does a computer turn binary into text?


I searched for similar questions here and none of them really gave the answer I’m looking for.
I understand we have the concept of encoding, where in the ASCII scheme the letter A, for example, is encoded as 01000001 because humans agreed on this to be so. *But where actually is that A* when the binary calls for it, if physically in the CPU, for this example, we just have 8 microscopic transistors, 2 holding a charge and 6 without a charge, that we point to and say "that there means A… but it also means 65 in decimal"? Clearly there's some level of abstraction here I'm not aware of.
Does 01000001 just correspond to a standardized pixel arrangement in the shape of an A, one that the hardware only actually renders when it's called on to be printed to a screen?
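To be concrete about the mapping I mean, here's a quick check in Python:

```python
# the agreed-on mapping: letter -> number -> bit pattern
print(ord("A"))                 # 65
print(format(ord("A"), "08b"))  # 01000001
```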


8 Answers

Anonymous

Very briefly, to add to everything here:

Your computer, at base, operates in ones and zeroes, true.

Saved in different areas and different places on your computer are, basically, dictionaries.

These dictionaries tell the computer what certain things mean in certain contexts.
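For example, the exact same byte can be read through different dictionaries and mean different things. A rough Python sketch (the pixel-brightness reading at the end is just a made-up example of another context):

```python
b = 0b01000001   # one byte sitting in memory

print(b)         # read as a plain number: 65
print(chr(b))    # read through the ASCII "dictionary": A
print(b / 255)   # read as a made-up pixel brightness: about 0.25 of full bright
```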

A program tells your computer, "The next chunk of ones and zeroes is a letter. Look up the letter table and find the letter that matches the following ones and zeroes."

Then it spits out that code, and the computer has already been told that *this* set is going to be a letter and not, say, a symbol, an instruction to do something, or a graphical element.

The computer then goes, "Here is the ASCII table. Here are the ones and zeroes you gave me. Aha! An 'A'."
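Here's a rough Python sketch of what a tiny slice of that letter table could look like (just the capital letters, built on the spot rather than how a real system stores it):

```python
# a tiny slice of the "letter table": bit pattern -> character (capital letters only)
ascii_table = {format(code, "08b"): chr(code) for code in range(65, 91)}

print(ascii_table["01000001"])  # A
```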

Place that "A" in the word processor, and tell the graphics table that pixels 998, 999, and these other pixels need to flip to 0 brightness to block out the lights on the monitor, so the human can see an "A" made out of the bajillion shutters/lights on the screen.
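That last letter-to-pixels step comes from yet another dictionary: a font, which stores a little picture for each character. A rough Python sketch with a made-up 8x8 bitmap for "A" (real fonts are far more detailed, but it's the same idea):

```python
# a made-up 8x8 bitmap for "A": each row is 8 bits, 1 = light that pixel, 0 = leave it dark
GLYPH_A = [
    0b00011000,
    0b00100100,
    0b01000010,
    0b01111110,
    0b01000010,
    0b01000010,
    0b01000010,
    0b00000000,
]

for row in GLYPH_A:
    # print "#" for lit pixels and a space for dark ones, left to right
    print("".join("#" if row & (1 << (7 - col)) else " " for col in range(8)))
```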
