Quantum bits are not “both on and off”. It’s better to think in terms of the kinds of numbers they can represent. Traditional digital bits can only represent two numbers, 0 or 1. You can do a lot with that, but some things would “take forever” if you tried doing them on a regular computer because of how many calculations you’d need to make.
Think of quantum bits (“qubits”) as being able to represent *any* number from 0 to 1. We’re no longer stuck with just 0 or 1; we can set our qubits to whatever value we need. And that opens up some wild possibilities: even though a single qubit represents a single number, that number can be so precise that we can treat it as a pattern instead of just a number. For example, 0.124512331277 is just a number, but we can *also* treat it as a convenient way to represent the six values 12, 45, 12, 33, 12, and 77 in a single number.
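To make that packing idea concrete, here’s a toy sketch in plain Python. There’s nothing quantum about it; it just shows the digit-trick the analogy uses, with the same example number:

```python
# Toy illustration of the digit-packing analogy (ordinary Python, not quantum):
# six two-digit values squeezed into one precise number between 0 and 1.

values = [12, 45, 12, 33, 12, 77]

# Pack: 12, 45, 12, 33, 12, 77 -> 0.124512331277
packed = sum(v / 100 ** (i + 1) for i, v in enumerate(values))
print(packed)  # 0.124512331277 (up to floating-point precision)

# Unpack: read the digits back out, two at a time.
digits = f"{packed:.12f}".split(".")[1]
unpacked = [int(digits[i:i + 2]) for i in range(0, len(digits), 2)]
print(unpacked)  # [12, 45, 12, 33, 12, 77]
```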
If we can then figure out a way to perform calculations such that the pattern is preserved (e.g. the first two digits are one value, the next two digits another, and so on), then we can calculate six different things at the same time. And that scales: it lets certain calculations run on a quantum computer thousands, millions, or even more times faster than on a traditional computer, because we don’t have to rerun the same computation again and again until we’ve processed all the inputs one by one. The quantum computer does the work on all our inputs at the same time.
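Continuing the toy Python analogy (and to be clear, a real qubit doesn’t work by decimal digits; this is just a picture of “one operation acting on many packed values”): a single addition can bump every value in the pattern at once.

```python
# Toy "one operation, many values" demo (an analogy only, not how qubits work):
# one carefully chosen addition updates all six packed values in a single step.

packed = 0.124512331277  # holds the pattern 12, 45, 12, 33, 12, 77

# 0.010101010101 adds 1 to each two-digit slot in one arithmetic operation.
step = sum(1 / 100 ** (i + 1) for i in range(6))
packed += step

digits = f"{packed:.12f}".split(".")[1]
print([int(digits[i:i + 2]) for i in range(0, len(digits), 2)])
# [13, 46, 13, 34, 13, 78] -- all six values updated by one operation
```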
That does come with a downside. While we can “prepare” a qubit to be a specific number (there are operations we can perform to change a qubit by some known amount, so we can get it into the initial state we need, similar to how you would prepare a traditional set of bits to represent your starting values), we can’t just look at a qubit to see what value it is: when we try to read it, we’re essentially rounding the number to a whole number again. So quantum computers are used to run algorithms where the computation needs to run for lots of inputs all at the same time, but results in something that can be represented as 1s and 0s, representing one outcome. That may sound like a deal breaker, but a lot of computational tasks can be phrased as “give me _an_ answer to this problem” rather than “give me _the_ answer to this problem”, and plenty of tasks exist where there is only one answer to find, if there is one.
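In the same toy picture, “reading” is like rounding: the precise pattern is destroyed, and all you get back is a whole number.

```python
# Toy analogy for measurement: reading collapses the packed number
# to a whole value, and the pattern inside is lost.

packed = 0.124512331277  # the carefully prepared pattern
result = round(packed)
print(result)  # 0 -- only a single whole-number outcome survives
```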
So quantum computers are useless for everyday computing, because they work completely differently from “normal” computers, but they’re incredibly important to science, and to things that rely on science (which a lot of industries do), for the same reason.