Inside a computer it’s all numbers, essentially. All 0s and 1s, off and on. These are just arranged in useful ways, and standards have been developed so that certain arrangements always mean a certain thing.
So let’s say we use 8 bits to represent a character. That gives us 256 different on/off combinations, and thus 256 characters to work with. If we say the first 10 combinations are the digits 0-9, we can then say the following 52 are the 26 uppercase and 26 lowercase letters.
From there, the computer can examine a particular character and determine whether it falls within the first 10 combinations, the next 52, or any of the remaining ones, just by looking at the specific 8 bits that represent it.
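The hypothetical scheme above (digits first, then letters) can be sketched in a few lines of Python. This is a made-up toy encoding for illustration, not a real standard:

```python
# Toy scheme from the explanation above (NOT a real standard):
# codes 0-9 are the digits, codes 10-61 are the 52 letters,
# everything else is "other".
def classify(code):
    if not 0 <= code <= 255:
        raise ValueError("expected an 8-bit value (0-255)")
    if code < 10:
        return "digit"
    if code < 10 + 52:
        return "letter"
    return "other"

print(classify(5))    # digit
print(classify(40))   # letter
print(classify(200))  # other
```

The point is just that "which character is this?" reduces to a range check on a number.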
There are standards that are followed, such as ASCII or Unicode. These tell you which bit patterns in the computer map to which letter or number.
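You can see the real ASCII mapping directly in Python, since `ord` gives the number behind a character and `format(..., "08b")` shows its 8-bit pattern:

```python
# Real ASCII codes: each character is a fixed number with a fixed bit pattern.
for ch in "A9a":
    print(ch, ord(ch), format(ord(ch), "08b"))
# A 65 01000001
# 9 57 00111001
# a 97 01100001
```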
When you receive a text message, or even read this post, there is an area of your computer's memory that contains the binary representation of the message, and the computer maps those binary numbers to the specific characters to display.
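That memory-to-characters step is exactly what decoding is. A small sketch: the raw bytes 72, 105, 33 sitting in memory become "Hi!" once you tell the computer to read them as ASCII:

```python
# Three raw bytes in memory...
raw = bytes([72, 105, 33])
print(list(raw))             # [72, 105, 33]  -- just numbers
print(raw.decode("ascii"))   # Hi!            -- the same numbers, as text
```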
All the data that a computer interacts with is represented as electric charge. The presence or lack of this charge (“on” and “off”) can be conceptualised as binary numbers (0 or 1). So conceptually all the data a computer deals with are numbers.
Whether that binary data should be interpreted as literal numbers, or characters, or image data, or executable instructions, or whatever, is ultimately determined by the programmer writing the software, using various standards for storing data (file formats, file extensions, etc.).
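To make "interpretation is the programmer's choice" concrete, here is a sketch where the exact same four bytes are read three different ways. The byte values are arbitrary; the point is that nothing in the bytes themselves says which reading is "right":

```python
import struct

raw = bytes([72, 105, 33, 0])  # the same 4 bytes throughout

# Read as ASCII text (ignoring the trailing zero byte):
print(raw[:3].decode("ascii"))        # Hi!

# Read as one little-endian integer:
print(int.from_bytes(raw, "little"))  # 2189640

# Read as a 32-bit float:
print(struct.unpack("<f", raw)[0])    # a (tiny) floating-point number
```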
Everything is numbers inside. But at the final step of displaying a value, the program has to ask: is my single 8-bit byte here a number? Is it supposed to be rendered as an ASCII letter? Or is it actually part of a multi-byte character, where I have to read the next 8 bits as well?
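The "do I read the next byte too?" case is real in UTF-8: plain ASCII letters fit in one byte, but a character like é takes two, and the bit pattern of the first byte signals that more bytes follow. A quick illustration:

```python
# 'A' is one byte in UTF-8; 'é' is two. The leading bits of the first
# byte (110...) tell the decoder that a second byte belongs to it.
for ch in "Aé":
    b = ch.encode("utf-8")
    print(ch, len(b), [format(x, "08b") for x in b])
# A 1 ['01000001']
# é 2 ['11000011', '10101001']
```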