Computers don’t “understand” anything really, including binary.
They store data using binary, but binary is just one way of representing a number. The quantity 256 is the same whether you write it in decimal, write it in binary as 100000000, or count out 256 bananas.
Computers can certainly handle numbers written in other bases, decimal included. In the guts of a program, the programmer tells the computer what kind of input to expect, and that input gets translated into an underlying storage type (in the simple case, some fixed number of binary digits). That is the form operations (adding, etc.) are performed in. Then, when the programmer wants to display the result to a user, they might format it back out as a decimal number.
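A minimal sketch of that round trip in Python: parse text typed in some base into the machine's underlying integer, do arithmetic on it, then format it for display in whatever base suits the reader. (The specific input string here is just an illustration.)

```python
# Text a user typed, here written in binary (base 2).
user_input = "100000000"

# Parse: translate the base-2 text into the underlying integer.
value = int(user_input, base=2)   # -> 256

# Operations happen on the stored integer, regardless of input base.
value = value + 1                 # -> 257

# Display: format the same number back out in different bases.
print(value)        # decimal:     257
print(bin(value))   # binary:      0b100000001
print(hex(value))   # hexadecimal: 0x101
```

The number itself never "knows" a base; only the text going in and coming out does.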