What is Tokenization?

Tokenization is a process by which the primary account number (PAN) is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated PAN value. The security of an individual token relies predominantly on the infeasibility of determining the original PAN knowing only the surrogate value.
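To make the two processes concrete, the following is a minimal sketch of a vault-based tokenization service: the token is a random value, so the PAN cannot be derived from the token alone, and de-tokenization is a lookup in the vault. The class and method names are illustrative, not part of any standard, and a real solution would additionally encrypt the stored PAN and restrict access to the vault.

```python
import secrets

class TokenVault:
    """Illustrative vault-based tokenization (not a production design).

    The token is generated randomly, so knowing only the surrogate value
    gives no feasible way to determine the original PAN.
    """

    def __init__(self):
        # token -> PAN mapping; in practice the PAN would be stored
        # encrypted, with strong access controls around the vault.
        self._vault = {}

    def tokenize(self, pan: str) -> str:
        """Replace a PAN with a random surrogate value (a token)."""
        token = secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        """Redeem a token for its associated PAN value."""
        return self._vault[token]
```

Note that the mapping lives only inside the vault: systems that store the token but have no access to the de-tokenization service hold no cardholder data.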

Depending on the particular implementation of a tokenization solution, tokens used within merchant systems and applications may not need the same level of security protection associated with the use of PAN. Storing tokens instead of PANs is one alternative that can help to reduce the amount of cardholder data in the environment, potentially reducing the merchant’s effort to implement PCI DSS requirements.

The following key principles relate to the use of tokenization and its relationship to PCI DSS:

  • Tokenization solutions do not eliminate the need to maintain and validate PCI DSS compliance, but they may simplify a merchant’s validation efforts by reducing the number of system components to which PCI DSS requirements apply.
  • Verifying the effectiveness of a tokenization implementation is necessary and includes confirming that PAN is not retrievable from any system component removed from the scope of PCI DSS.
  • Tokenization systems and processes must be protected with strong security controls and monitoring to ensure the continued effectiveness of those controls.
  • Tokenization solutions can vary greatly across different implementations, including differences in deployment models, tokenization and de-tokenization methods, technologies, and processes. Merchants considering the use of tokenization should perform a thorough evaluation and risk analysis to identify and document the unique characteristics of their particular implementation, including all interactions with payment card data and the particular tokenization systems and processes.

One of the primary goals of a tokenization solution should be to replace sensitive PAN values with non-sensitive token values. For a token to be considered non-sensitive, and thus not require any security or protection, the token must have no value to an attacker.

Tokens come in many sizes and formats. Examples of some common token formats are included in the following table.
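As one illustration of a format-preserving style of token, the sketch below keeps the PAN's length and its last four digits while replacing the remaining positions with random characters. This particular format is hypothetical, chosen only to show the idea; actual token formats are defined by each tokenization solution.

```python
import secrets
import string

def format_preserving_token(pan: str) -> str:
    """Illustrative token: same length as the PAN, last four digits kept.

    All other positions are replaced with random alphanumeric characters,
    so the replaced digits cannot be recovered from the token alone.
    (Hypothetical format for illustration only.)
    """
    alphabet = string.ascii_uppercase + string.digits
    surrogate = "".join(secrets.choice(alphabet) for _ in range(len(pan) - 4))
    return surrogate + pan[-4:]
```

Preserving length and the last four digits lets existing applications and receipts handle the token without change, which is one common reason for choosing a format-preserving token.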

Tokens can be generally identified as either single-use or multi-use. A single-use token is typically used to represent a specific, single transaction. A multi-use token represents a specific PAN, and may be used to track an individual PAN across multiple transactions. A multi-use token always maps a particular PAN value to the same token value within the tokenization system. Determining whether single-use or multi-use tokens, or a combination of both, are appropriate for a particular merchant environment will depend on the merchant’s specific business need for retaining tokens.
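The behavioral difference between the two token types can be sketched as follows: a single-use tokenizer issues a fresh token for every transaction, while a multi-use tokenizer always returns the same token for a given PAN. The class names are illustrative assumptions, not terms from the standard.

```python
import secrets

class SingleUseTokenizer:
    """Sketch: each transaction gets its own token, even for the same PAN."""

    def __init__(self):
        self.vault = {}  # token -> PAN

    def tokenize(self, pan: str) -> str:
        token = secrets.token_hex(8)  # new random token every call
        self.vault[token] = pan
        return token

class MultiUseTokenizer:
    """Sketch: a particular PAN always maps to the same token value."""

    def __init__(self):
        self.vault = {}    # token -> PAN
        self._by_pan = {}  # PAN -> token (enforces the stable mapping)

    def tokenize(self, pan: str) -> str:
        if pan not in self._by_pan:
            token = secrets.token_hex(8)
            self._by_pan[pan] = token
            self.vault[token] = pan
        return self._by_pan[pan]
```

The stable mapping is what lets a merchant use multi-use tokens for cross-transaction analysis (for example, identifying a returning card), whereas single-use tokens deliberately break that linkage.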

When evaluating a tokenization system, it is important to consider all elements of the overall tokenization solution. These include the technologies and mechanisms used to capture cardholder data and how a transaction progresses through the merchant environment, including transmission to the processor/acquirer. The tokenization solution should also address potential attack vectors against each component and provide the ability to confirm with confidence that associated risks are addressed.

The security and robustness of a particular tokenization system rely on many factors, including the configuration of the different components, the overall implementation, and the availability and functionality of the security features for each solution.