How do computers know what to do with binary 1’s and 0’s?


I’m very interested in learning how computers work, but can’t seem to find the exact information I’m looking for. My understanding is, and **please** correct me if I’m wrong, that if you press the letter “A” on a keyboard, a circuit underneath will close, which sends electricity through wires, and based on the combination of voltages on the wires, the computer outputs an “A”. But how does the computer know what to do with voltages? What do the voltages represent? At what point does any of this information get converted into binary, and once it does, what happens?

I don’t expect someone to be able to explain this like I’m five. For me, it’s a difficult but really interesting subject. Any clarification and dumbing down is appreciated! I’m really hoping to get a better grasp on all of this.

Edit: I should’ve made the title “How do computers work?” Still wondering how computers know what to do with 1’s and 0’s, though.


13 Answers

Anonymous 0 Comments

0’s and 1’s are grouped together. A single bit almost never represents anything on its own; it is almost always the sequence of bits in such a grouping that carries meaning.
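A quick sketch of that idea in Python (the bit pattern here is just an illustrative example): eight individual bits mean nothing by themselves, but read as a sequence they form a number.

```python
# Eight bits in sequence, grouped as one byte.
bits = [0, 1, 0, 0, 0, 0, 0, 1]

# Read the sequence as a binary number: each step shifts left
# and appends the next bit.
value = 0
for bit in bits:
    value = value * 2 + bit

print(value)  # 65 -- the whole sequence has a value; no single bit does
```

Reorder the same eight bits and you get a completely different number, which is the point: the sequence, not the individual bits, carries the meaning.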

The meaning is by convention.

Just like a single pencil stroke on a piece of paper doesn’t necessarily mean anything, three lines drawn in the shape of an A mean ‘A’ only because we all agreed they do. And by making combinations, we can create further meanings (words, sentences, …).

In the early days of computing, there were multiple incompatible conventions, such as ASCII and EBCDIC; nowadays we are mostly aligned.
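You can see both conventions directly from Python’s standard codecs (cp037 is one common EBCDIC code page): the same letter ‘A’ maps to a different byte value under each agreement.

```python
# The letter 'A' under two different, incompatible conventions.
ascii_byte = "A".encode("ascii")   # ASCII:  0x41 = 65  = binary 01000001
ebcdic_byte = "A".encode("cp037")  # EBCDIC: 0xC1 = 193 = binary 11000001

print(ascii_byte[0], ebcdic_byte[0])  # 65 193
```

Neither bit pattern is inherently an ‘A’; a machine only outputs ‘A’ because it was built (or programmed) to follow one of these agreed-upon tables.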
