There is a set of standards called the Payment Card Industry Data Security Standard (PCI-DSS) which establishes the rules for how credit cards and related data are handled. It applies to everything from the back-end servers to the credit card terminals themselves. For stored card data, PCI-DSS essentially allows two approaches: encryption or tokenization.
Tokenization is a technique that creates a reversible placeholder 'token' for a card number – so for instance, given the card number '1111 1111 1111 1111' you might generate the token '9912 3456 7890 1234'. Tokens take advantage of the rules governing credit card Bank Identification Numbers (BINs) and Luhn check digits to ensure applications can tell the difference between real cards and tokens. Tokens are widely used on lower-security systems – say, a third-party vendor doing analytics.
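As a rough illustration, here is the Luhn check in Python. The '4111 1111 1111 1111' number is a well-known Visa test number that passes the check; the sample token above deliberately fails it, which is one way an application can tell it isn't a real card:

```python
def luhn_valid(pan: str) -> bool:
    """Return True if the digit string passes the Luhn check."""
    digits = [int(d) for d in pan.replace(" ", "")]
    total = 0
    # Double every second digit, counting from the right; subtract 9 if it exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("4111 1111 1111 1111"))  # True  (well-known test card number)
print(luhn_valid("9912 3456 7890 1234"))  # False (the sample token fails the check)
```

Real tokenization products vary – some issue Luhn-valid tokens in reserved BIN ranges instead – but the principle is the same: the token is distinguishable from a live card by rule, not by secret.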
Banks typically create a small set of highly controlled servers, isolated from the internet, that deal with raw credit card numbers. These are physically enclosed in cages, restricted with firewalls, and subject to strict access-control rules. PCI-DSS mandates strict requirements for most aspects of these servers.
Servers inside the restricted cardholder environment should use a mix of tokenization and encryption to protect card data. Encryption must use modern, standards-based algorithms – which rules out "new crypto" like blockchains or homomorphic encryption. (It should also exclude vaultless tokenization schemes, but PCI-DSS isn't really clear on that issue yet.)
Ideally, the only database storing encrypted card numbers would be the tokenization system itself. Its table would typically use HMAC to create a column of hashes enabling 'SELECT' lookups without decryption, a column of AES-256 ciphertext so the original number can be recovered, and a column of tokens generated from a cryptographically secure random source.
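A minimal sketch of such a vault row, using only Python's standard library. The key handling, the '99' token prefix, and the column names are assumptions for illustration; the AES-256 ciphertext column is only indicated in a comment, since authenticated encryption (e.g. AES-256-GCM) would come from a third-party library such as `cryptography`, and real keys would live in an HSM, not in code:

```python
import hmac
import hashlib
import secrets

# Assumed key material for illustration only; a real vault pulls keys from an HSM.
LOOKUP_KEY = secrets.token_bytes(32)   # key for the HMAC 'lookup' column
TOKEN_BIN = "99"                       # assumed reserved prefix so tokens can't be real cards

def lookup_hash(pan: str) -> str:
    """Deterministic HMAC-SHA256 of the card number, so SELECT ... WHERE hash = ? works."""
    return hmac.new(LOOKUP_KEY, pan.encode(), hashlib.sha256).hexdigest()

def make_token(length: int = 16) -> str:
    """Token built from a cryptographically secure random source, in the reserved range."""
    body = "".join(str(secrets.randbelow(10)) for _ in range(length - len(TOKEN_BIN)))
    return TOKEN_BIN + body

def vault_row(pan: str) -> dict:
    """One row of the hypothetical token vault table."""
    return {
        "hash": lookup_hash(pan),   # enables equality lookups without decrypting
        # "ciphertext": AES-256-GCM of pan under a separate data key (library omitted)
        "token": make_token(),      # what downstream systems are allowed to see
    }

row = vault_row("4111111111111111")
print(row["token"])  # starts with '99'; the rest is random, unrelated to the real number
```

The design point is that each column serves one job: the HMAC answers "have we seen this card before?", the ciphertext answers "what was the card?", and the token is the only value that ever leaves the vault.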