How do computers KNOW what zeros and ones actually mean?

Ok, so I know that the alphabet of computers consists of only two symbols, or states: zero and one.

I also seem to understand how computers count beyond one even though they don’t have symbols for anything above one.

What I do NOT understand is how a computer knows* that a particular string of ones and zeros refers to a number, or a letter, or a pixel, or an RGB color, and all the other types of data that computers are able to render.

*EDIT: A lot of you guys got hung up on the word “know”, emphasizing that a computer does not know anything. Of course, I do not attribute any real awareness or understanding to a computer. I’m using the verb “know” only figuratively, folks ;).

I think that somewhere under the hood there must be a physical element (like a table, a maze, a system of levers, a punch card, etc.) that breaks up the single, continuous stream of ones and zeros into rivulets and routes them into, for lack of a better word, different tunnels? One for letters, another for numbers, yet another for pixels, and so on?

I can’t make do with just the information that computers speak in ones and zeros, because that’s like dumbing down the process of human communication to the mere fact of relying on an alphabet.

Anonymous:

The computer itself doesn’t know. The code running on the computer decides. If the code says to add two things, the processor doesn’t care whether the bits represent numbers or something else; it will add them as if they were numbers. If you add the bits that represent 2 (00000010) to the bits that represent ‘A’ (01000001), you’ll get some other bits, 01000011, which you can interpret as basically anything: as a number you’ll get 67, and as a letter you’ll get ‘C’, for example.

In other words, if the code says to display 01000101 as a number, you’ll see 69, and if it says to display it as a letter, you’ll see ‘E’.
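To see this in action, here’s a minimal C sketch (my own illustration, not part of the original answer; the ASCII codes are the standard ones). The bytes never change; only the format specifier (%d vs %c) tells printf which interpretation to print:

    #include <stdio.h>

    int main(void) {
        unsigned char sum = 2 + 'A';        /* 00000010 + 01000001 = 01000011 */
        printf("as a number: %d\n", sum);   /* prints 67 */
        printf("as a letter: %c\n", sum);   /* prints C */

        unsigned char bits = 0x45;          /* 01000101 */
        printf("as a number: %d\n", bits);  /* prints 69 */
        printf("as a letter: %c\n", bits);  /* prints E */
        return 0;
    }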

This ability to reinterpret data as whatever you want is a really powerful concept in low-level programming; you can do a lot of neat tricks with it.
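One classic example of such a trick (my sketch, assuming IEEE 754 floats, which virtually all modern hardware uses): reading the raw bits of a float through an integer. The bytes stay put; only the label changes. This same relabeling is at the heart of famous hacks like the fast inverse square root.

    #include <stdio.h>
    #include <string.h>
    #include <stdint.h>

    int main(void) {
        float f = 1.5f;
        uint32_t bits;

        /* Copy the float's four bytes into a same-sized integer.
           The bit pattern is untouched; only its interpretation changes. */
        memcpy(&bits, &f, sizeof bits);

        printf("as a float:  %f\n", f);                  /* 1.500000 */
        printf("as raw bits: 0x%08X\n", (unsigned)bits); /* 0x3FC00000 */
        return 0;
    }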

However, most programmers don’t deal with this directly: they can declare that some data is supposed to be a number, and they’ll get an error if they try to do letter operations on it. That type information is thrown away, though, when the compiler generates the instructions the processor works with directly.
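A small C sketch of that difference (again my illustration; the rejected line is left as a comment): the compiler uses the declared types to catch mistakes before the program runs, but the instructions it emits only ever see bit patterns.

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        int n = 65;   /* declared as a number */

        /* At compile time, types are enforced: a "letter operation" on a
           number is rejected before anything runs. For example, most
           compilers refuse this line, because n is not a string:
           size_t len = strlen(n);
        */

        /* In the finished machine code, n is just the bits 01000001.
           Whether they show up as 65 or as 'A' is the code's choice: */
        printf("%d\n", n);        /* 65 */
        printf("%c\n", (char)n);  /* A */
        return 0;
    }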
