Card numbers are usually generated using a mathematical algorithm. Credit card numbers, for instance, include a check digit computed using [modular arithmetic](https://en.wikipedia.org/wiki/Modular_arithmetic). Because the card number is the result of a specific set of mathematical steps, a computer can instantly determine whether a number is valid. For example, given the first nine digits, the algorithm may make it impossible for the 10th digit to be a 7, so when you type a 7 in that position the computer immediately knows the number is wrong.
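To make the idea concrete, here is a toy sketch of a modular-arithmetic check digit. This is deliberately simpler than the real scheme used on cards (the Luhn check described in another answer): it just picks a final digit so the digit sum is a multiple of 10.

```python
def make_check_digit(partial: str) -> str:
    # Toy scheme (not the real card algorithm): append a final digit
    # chosen so that the sum of all digits is a multiple of 10.
    check = -sum(int(d) for d in partial) % 10
    return partial + str(check)

def is_valid(number: str) -> bool:
    # A number is "valid" under this toy scheme if its digit sum
    # is divisible by 10.
    return sum(int(d) for d in number) % 10 == 0
```

Any single mistyped digit changes the sum and fails the check, though this toy version (unlike Luhn) would not catch swapping two adjacent digits.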

Here’s another way it might work, other than a checksum:

The card company only issues card numbers that are more than one typo away from every other existing card number. This is why there are 16 digits in a card number even though a 10-digit number would be enough to give every person on Earth (roughly 8 billion people) their own: most of the possible combinations of digits in a 16-digit number are deliberately left unused by this rule.
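The "more than one typo away" rule described above can be sketched as a distance check. The function names here are illustrative, not from any real issuing system: a candidate number is only issued if it differs from every existing number in at least two digit positions, so no single-digit typo can turn one valid card number into another.

```python
def hamming(a: str, b: str) -> int:
    # Count the digit positions where two equal-length numbers differ.
    return sum(x != y for x, y in zip(a, b))

def far_from_all(candidate: str, issued: list[str]) -> bool:
    # Hypothetical issuing rule: accept the candidate only if every
    # already-issued number differs from it in at least two positions.
    return all(hamming(candidate, n) >= 2 for n in issued)
```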

It’s a basic error check called a Luhn check. There’s a standard equation the machine runs against all but the last digit of the card number (for a typical card, the first 15 of 16 digits), and the answer has to exactly match the final digit of the long card number (the PAN).

It’s not really a security feature, but it will catch almost any card number mistyped during manual entry, and of course some casual fraud.
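The Luhn check mentioned above is simple enough to sketch in a few lines: starting from the rightmost digit, double every second digit (subtracting 9 if the result exceeds 9), sum everything, and the number is valid if the total is a multiple of 10.

```python
def luhn_valid(pan: str) -> bool:
    total = 0
    # Walk the digits from right to left.
    for i, ch in enumerate(reversed(pan)):
        d = int(ch)
        if i % 2 == 1:       # double every second digit from the right
            d *= 2
            if d > 9:        # e.g. 8 -> 16 -> 16 - 9 = 7
                d -= 9
        total += d
    return total % 10 == 0   # valid when the sum is a multiple of 10
```

The common Visa test number `4111111111111111` passes this check, while changing any single digit makes it fail.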
