How tf does binary code work?


it’s just a bunch of 0’s and 1’s.. like I can just put a bunch of them and somehow make a sentence??? like what does this mean -> 010100101001010010 (i legit just spammed 0’s and 1’s)


Anonymous

Think of it as a number system, which is great. But it can be so much more.

To make it simple, let’s say you want to write out text using only numbers. Say 65 = A, 66 = B, 67 = C, and later 97 = a, 98 = b, etc. Then I can write Wizywig as 87, 105, 122, 121, 119, 105, 103.

That’s cool, but how do you turn 87105122121119105103 back into something usable? That’s where encoding rules come in. Let’s say every 3 digits get interpreted together: you read 3 digits, and boom, that’s one character. If a character’s number is shorter than 3 digits, just pad it with a 0 in front. So 087105122121119105103, easy peasy, you don’t need any commas.
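If you wanna poke at that idea, here’s a tiny Python sketch of it (the 3-digits-per-character scheme is just made up for this example, it’s not a real standard):

```python
# Toy "3 decimal digits per character" scheme from the paragraph above.
def encode(text):
    # ord() gives each character's number (87 for 'W', 105 for 'i', ...),
    # padded to exactly 3 digits so we don't need commas.
    return "".join(f"{ord(ch):03d}" for ch in text)

def decode(digits):
    # Read the string back 3 digits at a time and turn each chunk into a character.
    return "".join(chr(int(digits[i:i + 3])) for i in range(0, len(digits), 3))

print(encode("Wizywig"))                 # 087105122121119105103
print(decode("087105122121119105103"))  # Wizywig
```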

Well… decimal kinda sucks for this, but hey, what if 0 and 1 were the only digits available? So 0 = 0 and 1 = 1, but how do we go above 1? Just like you go above 9 in decimal: you carry into the next “place”. So 10 = 2 (a 1 in the “2s” place and a 0 in the “1s” place), 11 = 3, 100 = 4, etc.
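Same idea in Python, just as a sketch (int() with base 2 reads a string of 0s and 1s using exactly that place-value rule):

```python
# Binary is just place values: ...8s, 4s, 2s, 1s instead of ...1000s, 100s, 10s, 1s.
print(int("10", 2))    # 2   (one "2", zero "1"s)
print(int("101", 2))   # 5   (one "4", zero "2"s, one "1")
print(int("1111", 2))  # 15  (8 + 4 + 2 + 1)
```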

So it turns out the maximum number you can represent with one bit is 1 (0 or 1), with 2 bits it’s 3 (11), with 3 bits it’s 7 (111), etc. With 8 bits you can represent 256 different values (0 through 255), which is plenty for ASCII (ASCII proper only uses 0 through 127). Eventually we moved on to more interesting formats, but that’s more complex. Buuuuuuut now I can encode Wizywig as binary, I just gotta convert those numbers into binary (there’s a quick code sketch of this after the conversions below):

W = 87 = 01010111

i = 105 = 01101001

etc.

So 0101011101101001 will be “Wi”
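Here’s that quick sketch, just to show where those maximums and those 8-digit strings come from (format() with "08b" is Python’s way of printing a number as 8 binary digits):

```python
# Biggest value n bits can hold is 2**n - 1 (all ones).
for n in (1, 2, 3, 8):
    print(n, "bits ->", 2**n - 1)   # 1, 3, 7, 255

# format() with "08b" prints a number as 8 binary digits, zero-padded.
print(format(87, "08b"))    # 01010111  ('W')
print(format(105, "08b"))   # 01101001  ('i')
```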

Yay, we have a way to encode the characters we type. We can add more things in here: digits, uppercase and lowercase letters, punctuation, even invisible characters like “new line” or a space.
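And here’s a toy round trip of a whole word through 8-bit codes, roughly hand-rolling what an extended-ASCII encoder does (not how you’d do it in real code, Python’s built-in encode/decode handles this for you):

```python
def to_bits(text):
    # Each character becomes 8 binary digits, all glued together.
    return "".join(format(ord(ch), "08b") for ch in text)

def from_bits(bits):
    # Read 8 digits at a time and turn each chunk back into a character.
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

bits = to_bits("Wizywig")
print(bits)             # 01010111011010010111101001111001... (56 digits total)
print(from_bits(bits))  # Wizywig
```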

Wait, you may say… how does a computer actually use binary to run its programs? Same kind of idea, we just read the binary a bit differently. We start by reading an _instruction_; the instruction dictates how much more data it needs, we keep reading until it’s satisfied, and then we figure out what to do next. That’s how a processor reads info. Say every 8 bits is an instruction for the CPU, followed by whatever operands that instruction calls for, and then the next instruction starts right after. We keep consuming bits until we have what we need.
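Here’s a toy sketch of that “keep consuming bytes” loop. The opcodes are invented for the example, it’s not any real CPU’s instruction set:

```python
# Made-up instruction format: 1 opcode byte, then however many operand bytes
# that opcode needs. These opcodes are invented for the example, not a real CPU.
OPERAND_BYTES = {0x01: 2,   # LOAD reg, value
                 0x02: 2,   # ADD  reg, value
                 0x03: 0}   # HALT

def walk(program):
    pc = 0  # "program counter": where we are in the byte stream
    while pc < len(program):
        opcode = program[pc]
        n = OPERAND_BYTES[opcode]
        operands = list(program[pc + 1: pc + 1 + n])
        print(f"at byte {pc}: opcode {opcode:#04x}, operands {operands}")
        pc += 1 + n  # skip past this whole instruction to the start of the next one

walk(bytes([0x01, 0, 5,    # LOAD register 0 with 5
            0x02, 0, 3,    # ADD 3 to register 0
            0x03]))        # HALT
```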

Well, that seems pretty darn fragile. One bit off and the entire program becomes meaningless. YES EXACTLY!!!!!!!! That’s what we use compilers and validators for. Back in the days of punch cards, that’s exactly what happened. And these days, a “buffer overflow attack” exploits exactly this fragility to take over computers by sneaking in instructions of the attacker’s own.

But assuming you made a _perfect_ set of instructions… well, fantastic. The CPU can read one, and it can do things like “check register A against 0; if it is 0, jump to the instruction numbered <next 8 bits>, otherwise continue to the next instruction starting 8 bits later.” There are a few hundred commands the CPU can do to move numbers around through its logic, and through billions of these operations per second you get to read reddit.
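And here’s a toy version of that “jump if the register is zero” idea, again with invented opcodes, just to show how a loop falls out of it:

```python
# Tiny made-up machine: one register, a handful of invented instructions.
#   0x01 v -> LOAD  v into the register
#   0x02 v -> SUB   v from the register
#   0x03 a -> JZ    jump to address a if the register is 0
#   0x04 a -> JMP   always jump to address a
#   0x00   -> HALT
def run(program):
    reg, pc, steps = 0, 0, 0
    while program[pc] != 0x00:
        op = program[pc]
        arg = program[pc + 1]
        if op == 0x01:   reg, pc = arg, pc + 2
        elif op == 0x02: reg, pc = reg - arg, pc + 2
        elif op == 0x03: pc = arg if reg == 0 else pc + 2
        elif op == 0x04: pc = arg
        steps += 1
    print("register =", reg, "after", steps, "instructions")

run(bytes([0x01, 3,    # address 0: LOAD 3
           0x03, 8,    # address 2: JZ  -> jump to the HALT at address 8 once we hit 0
           0x02, 1,    # address 4: SUB 1
           0x04, 2,    # address 6: JMP back to address 2
           0x00]))     # address 8: HALT
```

Run it and the register counts down 3, 2, 1, 0 and then hits the HALT, which is the whole “loop” concept in nine bytes of program.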

The easiest way to visualize a computer program is this: imagine you had 8 switches and a piece of paper. Each line on the paper tells you a combination of switches to flip, which in turn causes other switches to flip in parts of the machine you don’t directly control, and eventually there’s a spot where you read some switches that tell you which line on the paper to read next. You read that line, it tells you which switches to flip, and so on, and so on. Now imagine someone is sitting on the other side of your giant board, and on that side the switches are colored, so as you flip switches, colors move across the board on the other side. And that’s how you get “output”, or images, out of a computer.
