[4] To deliver such services, the system maintains a vault database of tokens that are connected to the corresponding sensitive data.
Tokenization may be used to safeguard sensitive data involving, for example, bank accounts, financial statements, medical records, criminal records, driver's licenses, loan applications, stock trades, voter registrations, and other types of personally identifiable information (PII).
In this case, the merchant needs to retain only the token, while a trusted third party manages the relationship and holds the PAN.
The communication between the merchant and the third-party supplier must be secured to prevent an attacker from intercepting it and obtaining both the PAN and the token.
The security of an individual token relies predominantly on the infeasibility of determining the original PAN knowing only the surrogate value.
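The vault model described above can be sketched in a few lines. This is a minimal illustration, not any vendor's API: the class and method names are hypothetical, and a production vault would also need persistence, access control, and the backup and synchronization discussed later.

```python
import secrets

class TokenVault:
    """Minimal sketch of a token vault: a mapping from surrogate
    tokens to the sensitive values (here, PANs) they replace."""

    def __init__(self):
        self._store = {}  # token -> PAN

    def tokenize(self, pan: str) -> str:
        # A random token has no mathematical relationship to the PAN,
        # so the surrogate alone reveals nothing about the original.
        token = secrets.token_hex(16)
        self._store[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault holder (the trusted third party) can map back.
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert vault.detokenize(token) == "4111111111111111"
```

Because the token is drawn from a cryptographically secure random source rather than derived from the PAN, recovering the PAN requires access to the vault itself.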
[9] The choice of tokenization as an alternative to other techniques such as encryption will depend on varying regulatory requirements, interpretation, and acceptance by respective auditing or assessment entities.
The concept of tokenization, as adopted by the industry today, has existed since the first currency systems emerged centuries ago as a means to reduce risk in handling high value financial instruments by replacing them with surrogate equivalents.
In more recent history, subway tokens and casino chips were adopted within their respective systems to replace physical currency and mitigate cash-handling risks such as theft.
[13][14] More recently, these concepts have been extended, applying the same isolation tactic as a security mechanism for data protection.
[15] Tokenization was applied to payment card data by Shift4 Corporation[16] and released to the public during an industry Security Summit in Las Vegas, Nevada in 2005.
Tokenization and “classic” encryption effectively protect data if implemented properly, and a computer security system may use both.
This is an important distinction from encryption, because changes in data length and type can render information unreadable in intermediate systems such as databases.
With tokenization, specific data is kept fully or partially visible for processing and analytics while sensitive information is kept hidden.
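A common way to keep part of the data visible, as a brief illustrative sketch (the function name is hypothetical), is a surrogate that preserves the PAN's length and all-numeric format while exposing only the last four digits:

```python
import secrets

def tokenize_pan(pan: str) -> str:
    """Replace all but the last four digits with random digits.
    The token keeps the PAN's length and type (all numeric), so
    intermediate systems can store and display it unchanged."""
    visible = pan[-4:]  # last four digits stay readable for receipts, support, analytics
    surrogate = "".join(str(secrets.randbelow(10)) for _ in range(len(pan) - 4))
    return surrogate + visible

token = tokenize_pan("4111111111111111")
assert len(token) == 16 and token.endswith("1111")
```

In practice such a token would still be paired with a vault entry, since random digit replacement by itself is not reversible and collisions must be handled.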
In comparison to encryption, tokenization technologies reduce time, expense, and administrative effort while enabling collaboration and data sharing.
Tokens can be: single or multi-use, cryptographic or non-cryptographic, reversible or irreversible, authenticable or non-authenticable, and various combinations thereof.
In the context of payments, the difference between high-value tokens (HVTs) and low-value tokens (LVTs) plays a significant role.
In order for an LVT to function, it must be possible to match it back to the actual PAN it represents, albeit only in a tightly controlled fashion.
This requires the storage, management, and continuous backup of the token database for every new transaction to avoid data loss.
Another problem is ensuring consistency across data centers, requiring continuous synchronization of token databases.
[25][26] Random-number generators have limitations in terms of speed, entropy, seeding, and bias; their security properties must be carefully analysed and measured to avoid predictability and compromise.
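The bias concern can be made concrete with a small sketch (illustrative function names, not from any standard): naively reducing random bits modulo 10 skews the digit distribution, while rejection sampling keeps it uniform.

```python
import secrets

def biased_digit() -> int:
    # Naive approach: 4 random bits reduced mod 10. The 16 possible
    # outcomes map onto 10 digits, so 0-5 each arise from two bit
    # patterns while 6-9 arise from one (modulo bias).
    return secrets.randbits(4) % 10

def unbiased_digit() -> int:
    # Rejection sampling: discard draws of 10-15 so every digit is
    # equally likely. secrets.randbelow(10) does this internally.
    while True:
        d = secrets.randbits(4)
        if d < 10:
            return d
```

A token generator built on the biased version would emit predictably skewed surrogates, which is exactly the kind of property that must be measured before deployment.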
With tokenization's increasing adoption, new tokenization technology approaches have emerged to remove such operational risks and complexities, and to enable the scale required by emerging big-data use cases and high-performance transaction processing, especially in financial services and banking.
[31] Using blockchain, as opposed to relying on trusted third parties, it is possible to run highly accessible, tamper-resistant databases for transactions.
[37][38] The tokenization of equity is facilitated by STOKR, a platform that links investors with small and medium-sized businesses.
Tokens issued through the STOKR platform are legally recognized as transferable securities under European Union capital market regulations.
Storage of tokens and payment card data must comply with current PCI standards, including the use of strong cryptography.
[46] PCI DSS is the standard most frequently applied to tokenization systems used by payment-industry players.