I know the gist of binary, but something I have never understood is how a computer knows where one “word” stops and another starts.
So if 9 is 1001 and 57 is 111001, how does the computer know that the “1001” in 111001 isn’t whatever 11 is followed by the number 9, 1001? Only having two digits, 1 and 0, seems like not enough to tell them apart.
If you want to have the word “apple” written in binary, do you take the binary code of each letter and smoosh them together, or is there a separate specific code for “apple”?
I’m not a computer expert, so maybe I’ll get a “Well actually”, but I did build an 8-bit computer on breadboards. The bus is 8 bits wide, so while you only need 1001 to represent a 9, the computer sends the full 8 bits for every data transfer: it puts 00001001 on the bus when moving a 9 between modules. Since every transfer between 8-bit modules uses the full 8 bits, there is no confusion about where a value starts and stops. The memory only uses 4 bits, so only the first 4 bits are read from the bus and the last 4 bits are ignored. I would expect a more modern computer with a 32- or 64-bit bus to operate in a similar manner.
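To make the fixed-width idea concrete, here is a minimal Python sketch (my own illustration, not how any particular machine stores things): every value gets padded out to a full 8 bits, and text like “apple” is just each letter’s 8-bit ASCII code joined end to end, which splits back apart cleanly precisely because the width is fixed. The helper names are made up for the example.

```python
def to_byte(value):
    """Pad a number's binary form out to a full 8 bits."""
    return format(value, "08b")

def split_bytes(bitstring):
    """Cut a long bit string back into fixed 8-bit chunks."""
    return [bitstring[i:i + 8] for i in range(0, len(bitstring), 8)]

# Numbers: 9 and 57 each take a full byte, so there is no ambiguity
# about where one value ends and the next begins.
print(to_byte(9))    # 00001001
print(to_byte(57))   # 00111001

# Text: "apple" is each letter's code (ASCII here) smooshed together...
bits = "".join(to_byte(ord(ch)) for ch in "apple")
print(bits)          # 0110000101110000011100000110110001100101

# ...and because every letter is exactly 8 bits, it splits back cleanly.
print([chr(int(b, 2)) for b in split_bytes(bits)])  # ['a', 'p', 'p', 'l', 'e']
```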