How does a computer turn binary into text?

I searched for similar questions here and none of them really gave the answer I’m looking for.
I understand we have the concept of encoding where in the ascii scheme, the letter A, for example, is encoded to 01000001, because humans agreed on this to be so, *but where actually is that A* when the binary calls for it, if physically in the cpu, for this example we just have 8 microscopic transistors, 2 holding a charge and 6 without a charge that we point to and say “That there means A…but it also means 65 in decimal”? Clearly there’s some level of abstraction here I’m not aware of.
Does 01000001 actually just correspond to a standardized pixel arrangement in the shape of A that is only actually rendered by the hardware in the situation where it’s called to be printed on a screen?

The binary code of the letter means nothing to the processor or other hardware; it’s just a value. Text processing software is what makes it the letter “A”. The program running on your computer calls a subroutine from a library that operates on this value in a way appropriate for the letter “A”: for example, it calls another subroutine from another library that calculates which pixels should be colored to display the letter.

yes, 01000001 is the only representation of A that there is in your cpu.

there is a font, which contains the shape of the letter A, so when it’s time to render 01000001 to the screen, the correct shape in the font is looked up and that’s drawn, according to the font rules for sizing, kerning, etc.

even back in the console days, screen data just had 01000001 in it. the graphics control chip knew where the CRT monitor’s beam was on the screen, determined which screen cell that was in memory, retrieved 01000001 as being the character in that cell, looked up the 8×8 pixel character in the font buffer, determined which pixel it was currently drawing in that character from the beam location, and turned the beam on if it was a lit pixel, and off if it was an unlit pixel.

if it was graphics mode instead of text mode, the pixels were rendered into memory directly, but the same font buffer was used to do so and the beam was coloured according to the current pixel from the graphics memory, so no real difference.
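A rough sketch of that text-mode lookup, in Python with made-up names (real hardware did this in logic, once per pixel, in step with the beam):

```python
# Hypothetical sketch of a text-mode video chip's per-pixel decision.
# screen_ram holds one character code per 8x8 cell; font maps each
# character code to 8 bytes, one byte (= 8 pixels) per glyph row.

COLS = 40  # characters per screen row (assumed for this sketch)

def pixel_is_lit(beam_x, beam_y, screen_ram, font):
    cell = (beam_y // 8) * COLS + (beam_x // 8)      # which character cell
    char_code = screen_ram[cell]                     # e.g. 0b01000001 for "A"
    row_bits = font[char_code][beam_y % 8]           # the glyph row under the beam
    return bool(row_bits & (0x80 >> (beam_x % 8)))   # test this pixel's bit
```

Feed it a beam position and it answers "on" or "off", which is all the CRT needed.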

The layer of abstraction you’re thinking of is ASCII: the contract that links binary and text together.

That 01000001 corresponds to an ASCII value, which in turn corresponds to “A”.
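In Python, for instance, you can see both readings of the same bit pattern:

```python
# One bit pattern, two interpretations: a number and an ASCII character.
code = 0b01000001   # the byte from the question

print(code)         # 65 -- read as a number
print(chr(code))    # A  -- read as ASCII text
print(ord("A"))     # 65 -- and back again
```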

[Here’s some more info.](https://www.google.com/amp/s/www.nerdynaut.com/understanding-how-binary-code-get-converted-to-text%3famp)

> As an example, the computer processes the binary 01100101 into its equivalent ASCII decimal value 101, which the computer recognizes as the letter “e” based on ASCII standard conversion. This is then displayed as the letter “e” on the computer screen. The same process is also followed in online applications such as binary to text translator binary to ASCII converter and other code conversion programs.

Hope that helps.

For the number: 01000001 binary means 65 decimal, that’s just how it is. These two are the exact same number, represented in different notations.

Just like the 6 in 65 is worth 60, the leftmost 1 in 01000001 binary is worth 64, the rightmost is worth 1, so 01000001 binary is worth 65. Each spot you go left in binary is worth double the previous one, instead of ten times the previous one like it does in decimal. The computer never even has to convert from binary to decimal because the value is the same.
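The doubling rule above can be checked in a couple of lines of Python:

```python
# Each step left is worth double the previous position,
# so accumulate by doubling and adding the next bit.
bits = "01000001"
value = 0
for b in bits:
    value = value * 2 + int(b)

print(value)               # 65
print(int("01000001", 2))  # 65, same thing with the built-in parser
```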

For the letter: 65 decimal meaning A in ASCII is, as you said, a standard people agreed on. The value by itself means nothing to the processor, it actually means more to people that know the ASCII table than it does to a processor.

Software needs to be written so that the value can be taken all the way from just a number to a letter being presented on your screen. What this software looks like will change based on the device running the code: if it’s using a segmented display, it will run some code to turn some bars on and others off; if it’s using a modern display, the process involves getting the data for the letter A in whatever font it needs to be displayed in, and calculating what all of its pixels will look like. That often means a lot of math, since fonts nowadays are not even defined in pixels; they are defined mathematically so that they can be scaled to any size.

It used to be the case that these codes would point to an address in a ROM chip that would contain the actual character graphics, so you might have

ROM Address 01000001000 = 01111000
ROM Address 01000001001 = 11001100
ROM Address 01000001010 = 11001100
ROM Address 01000001011 = 11111100
ROM Address 01000001100 = 11001100
ROM Address 01000001101 = 11001100
ROM Address 01000001110 = 11001100
ROM Address 01000001111 = 00000000

So the video chip will read the character, look up the graphic for that character, and display it.
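As a toy illustration (not real video hardware), you can draw those eight ROM bytes the same way the chip would, bit by bit:

```python
# The eight ROM bytes for the glyph, one byte per pixel row.
glyph = [0b01111000, 0b11001100, 0b11001100, 0b11111100,
         0b11001100, 0b11001100, 0b11001100, 0b00000000]

for row in glyph:
    # Walk the bits left to right: 1 = lit pixel, 0 = dark.
    print("".join("#" if row & (0x80 >> i) else "." for i in range(8)))
```

Run it and an 8×8 picture of the “A” appears, built from nothing but those bytes.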

In modern computers it’s way more complex, but the idea is the same.

It’s not in the CPU. It’s easiest to talk about the early DOS era. In that time, it’s in RAM, copied initially from a table burned into the BIOS, which is the software that boots the machine and provides some basic capabilities.

There you find a font, 256 cells of 8×8 or 8×16 depending on video mode, in which each character is drawn in binary. As in 1=white, 0=black.

So in position 65 you’ll find a picture of an “A”.

And if you have the [right software](https://int10h.org/blog/img/fontraption-1.png) you can modify that memory, and you’ll see every “A” on the screen change the moment you touch it.

Damn, this sub is delivering.
Each post here so far really helped to demystify things and connect the dots in my understanding. Thanks!

Very briefly, to add to everything here:

Your computer, at base, operates in ones and zeroes, true.

In different areas and saved in different places on your computer are basically dictionaries.

These dictionaries tell the computer what certain things mean in certain contexts.

A program tells your computer “the next line of ones and zeroes is a letter. Look up the letter table and find the letter that matches the following ones and zeroes.”

Then it spits out that code, and the computer already knows that *this* set is going to be a letter and not, say, a symbol, an instruction to do something, or a graphical element.

The computer then goes here is the ASCII table. Here are the ones and zeroes you gave me. Aha! An “A”

Place that “a” in the word processor, then tell the graphics hardware that pixels 998, 999, and these other pixels should flip to 0 brightness to block out the light on the monitor, so the human can see an “a” made out of the bajillion shutters/lights on the screen.