How does a computer program generate random numbers? Example: when you ask Siri to give you a random number between 1 and 10, how does it come up with that number?


A computer can generate truly random numbers by measuring some external physical phenomenon with more digits of precision than are needed to capture the phenomenon itself. If you do this with a suitable signal, the rightmost digit(s) of your input data are truly random. Say you record sound but keep only the last digit of each sample: that digit is mostly thermal noise from the microphone and its electronics, and it is random. To get a big enough number, you then need to gather multiple samples; see (*) at the end if you are interested in this detail.
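
As a rough illustration of the idea (a minimal sketch, not any real phone's implementation), here is how you could harvest the least significant bit of each sample and pack those bits into a number. The sample values below are invented stand-ins for real microphone data:

```python
# Sketch: keep only the least significant bit of each raw audio sample,
# since that bit is dominated by thermal noise, then pack the bits
# into an integer. The sample values are made up for illustration.

def lsb_bits(samples, n_bits):
    """Keep the least significant bit of each of the first n_bits samples."""
    return [s & 1 for s in samples[:n_bits]]

def bits_to_int(bits):
    """Pack a list of bits (most significant first) into an integer."""
    value = 0
    for b in bits:
        value = (value << 1) | b
    return value

# Pretend these integers came from the microphone's analogue-to-digital converter:
samples = [12873, 12874, 12871, 12880, 12869, 12875, 12872, 12878]
print(bits_to_int(lsb_bits(samples, 8)))  # 8 noise bits -> a number from 0 to 255
```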

Siri is unlikely to do anything so complicated. It will just call a built-in random number generation routine. This might return a truly random number (if Apple has chosen to do so, in which case the number will likely come from the least significant digits of some internal signal in the phone’s microprocessor, such as a measurement of its internal operating voltage). Just as likely, it might return a “pseudorandom” number: a number derived from something seemingly random, such as the current time at the moment of your query, by a simple mathematical function that makes the output appear random. The latter are also the type of “random” numbers you would use for random events in computer games, as they are much faster to generate when nanoseconds count; a true random number from, for example, incoming sound would take about a millisecond to gather.
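
One classic example of such a "simple mathematical function" is a linear congruential generator; this is just an illustration of the technique, not what Apple actually uses. A minimal sketch, seeded from the clock, using the well-known constants from Numerical Recipes:

```python
import time

# Sketch of a pseudorandom generator: a linear congruential generator (LCG)
# seeded from the current time. Real libraries use better algorithms;
# the constants here are the classic Numerical Recipes ones.

class LCG:
    def __init__(self, seed=None):
        # Seed from the clock if none given -- "something seemingly random".
        self.state = seed if seed is not None else time.time_ns() & 0xFFFFFFFF

    def next_raw(self):
        # One step of the recurrence: state = (a * state + c) mod 2**32
        self.state = (1664525 * self.state + 1013904223) & 0xFFFFFFFF
        return self.state

    def randint(self, low, high):
        # Map the raw 32-bit output into [low, high]. A plain modulo has a
        # tiny bias, but it is good enough for an illustration.
        return low + self.next_raw() % (high - low + 1)

rng = LCG()
print(rng.randint(1, 10))  # a "Siri-style" random number from 1 to 10
```

The recurrence is fully deterministic: seed it with the same value twice and you get the same "random" sequence twice, which is exactly what makes it pseudorandom rather than random.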

But with a modern microprocessor in the phone, both alternatives are equally likely, and for the purpose of Siri giving you a few random numbers every now and then you would not be able to tell the difference. Both appear equally random to you, or to any observer other than Siri.

(*) Computers of course work in binary, and most analogue input signals are actually represented by an integer. The “last digit” above will in reality be the least significant bit of the integer variable that represents the analogue input signal. To get a number from 1 to 10 with approximately equal probabilities, you need to gather (much) more than 4 bits (4 bits would give you only 16 possible values). Say you gather 16 bits. Turn them into a number from 1 to 65536 and keep its last decimal digit, treating 0 as 10. Not perfectly uniform. Not very efficient. But it would work just fine. Of course, to avoid the hassle of implementing such a thing yourself, modern microprocessors already contain a built-in instruction to do just this. The processor likely also gathers the input data before you ever ask for a number, so that it can give the first few random numbers instantly; it will then have to wait a while before it can give more, should you ask way too often. This is why things like games would not use these numbers.
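
In code, the recipe from this footnote looks like the sketch below. The bit list is an invented example; in reality each bit would be the least significant bit of one analogue sample:

```python
# Sketch of the footnote's recipe: pack 16 noise bits into a number from
# 1 to 65536, keep its last decimal digit, and treat 0 as 10.
# The bit list is an invented example for illustration.

bits = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]

value = 0
for b in bits:
    value = (value << 1) | b   # 0 .. 65535
value += 1                     # shift to 1 .. 65536

digit = value % 10                      # last decimal digit, 0 .. 9
result = digit if digit != 0 else 10    # map 0 to 10 to get 1 .. 10
print(result)
```

As the footnote says, this is not perfectly uniform (some digits occur one time more than others across the 65536 possibilities), but the bias is tiny.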
