01 October 2018

Tokenisation

What is tokenisation and how does it reduce PCI DSS compliance scope?

Since 2005, tokenisation has been one of the most effective methods used by the payment card industry to secure data and reduce the scope of PCI DSS compliance. Over time, tokenisation has evolved to secure all kinds of sensitive data, including personally identifiable information (PII) such as names, email addresses, dates of birth, driver's licence numbers and passport details, amongst many others.

What is Tokenisation?

Tokenisation, when applied to payment card security and PCI DSS compliance, is the practice of substituting the Primary Account Number (PAN), the primary piece of cardholder data, with a surrogate value called a token. If intercepted or stolen in a data breach, these tokens protect the PAN because (when generated properly) they cannot be reverse engineered and traced back to the original PAN. Financially sensitive credit card data is therefore replaced with non-sensitive tokens throughout the merchant's environment, significantly reducing the financial risks associated with a data breach.

Contact centres have various channels for collecting credit card data. It can arrive via an agent, an IVR system as voice/DTMF tones, or a customer telephone call. This data moves along the payment process, and at some point the PAN may be stored to facilitate a transaction, for customer service reasons or for recurring payments. When PAN data is stored, the merchant is obligated to protect it in accordance with the Payment Card Industry Data Security Standard (PCI DSS).

Related Content: PCI DSS Compliance Guide for Contact and Call Centres

When PAN data is stored, it can also be displayed to agents for customer service reasons, which creates multiple points for data leakage and potential opportunities for misuse.

So how does tokenisation manage this risk? When you analyse the process carefully, the PAN isn't needed at every step of the payment process. In the flow described above, the full credit card number is only needed at the verification stage within the payment gateway; everywhere else a token with masked PAN data is sufficient.

Tokenisation is how the credit card data is protected. The tokenised credit card information acts as a placeholder in the payment process. For example, a credit card may be tokenised as 6744x# mdew )$cd yu821, where 6744 and 821 are digits retained from the real card number. While this appears simple, to be secure from easy decoding it is not a simple one-to-one substitution; cybercriminals therefore cannot decode the pattern by a technique known as farming, in which many substitutions are reverse engineered. Given the significance and sensitivity of this topic, the PCI Security Standards Council has issued a tokenisation supplement, which states:

“The security of an individual token relies predominantly on the infeasibility of determining the original PAN knowing only the surrogate value.”

PCI DSS Supplement on Tokenisation
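To make this concrete, the sketch below (in Python, with an in-memory dictionary standing in for a real token vault and purely illustrative names) shows one way a PAN could be swapped for a surrogate in the masked format described above. It is an illustration of the concept only, not a production implementation.

```python
import secrets
import string

# In-memory stand-in for a token vault; a real solution would use a
# hardened, access-controlled data store. All names here are illustrative.
TOKEN_VAULT = {}

# Characters used for the random middle portion of the token.
_TOKEN_ALPHABET = string.ascii_letters + string.digits + "#$%"

def tokenise(pan: str) -> str:
    """Replace a PAN with a surrogate that keeps only the leading and
    trailing digits shown in the 6744...821 example above."""
    middle = "".join(secrets.choice(_TOKEN_ALPHABET) for _ in range(9))
    token = f"{pan[:4]}{middle}{pan[-3:]}"
    TOKEN_VAULT[token] = pan  # only the vault can map the token back to the PAN
    return token

# The merchant's systems handle only the token, never the full PAN.
print(tokenise("6744990012345821"))  # e.g. 6744xA#q2ZkP9821 -- random each call
```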

The fundamental principles that govern tokenisation for PCI DSS, as set out in the tokenisation supplement, are outlined below.

  • Tokenisation solutions do not eliminate the need to maintain and validate PCI DSS compliance, but they may simplify a merchant’s validation efforts by reducing the number of system components for which PCI DSS requirements apply.
  • Verifying the effectiveness of a tokenisation implementation is necessary and includes confirming that PAN is not retrievable from any system component removed from the scope of PCI DSS.
  • Tokenisation systems and processes must be protected with strong security controls and monitoring to ensure the continued effectiveness of those controls.
  • Tokenisation solutions can vary greatly across different implementations, including differences in deployment models, tokenisation and de-tokenisation methods, technologies, and processes.
  • Merchants considering the use of tokenisation should perform a thorough evaluation and risk analysis to identify and document the unique characteristics of their particular implementation, including all interactions with payment card data and the particular tokenisation systems and processes. Remember that PCI DSS applies to any organisation that stores, processes or transmits credit card data.

Choosing the Right Tokenisation Method: Tokenisation vs Encryption

As you can imagine, there are multiple ways to achieve tokenisation. Some may even be tempted to use encryption algorithms, as at a high level both appear to cater to the same need. However, there is a subtle difference between encryption and tokenisation.

Encryption is usually a reversible mathematical function: the ciphertext can be deciphered with the right key and algorithm. It is mostly used to share data confidentially across systems. Tokens, on the other hand, are not generated mathematically from the data and cannot be deciphered using an encryption key. They are randomly generated and are meant to hide the data, not to share it. The PAN can only be recovered via a token vault, which maps the relationship between each token and the original PAN. Encryption is then applied to the vault to safeguard those mappings.
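The short sketch below illustrates the distinction. It assumes the third-party cryptography package purely for the encryption half of the comparison; the tokenisation half uses nothing more than a random value and a dictionary standing in for the vault.

```python
import secrets
from cryptography.fernet import Fernet  # assumed third-party package, for illustration

pan = "6744990012345821"

# Encryption: a reversible mathematical function -- anyone holding the key
# can recover the original PAN from the ciphertext.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(pan.encode())
assert Fernet(key).decrypt(ciphertext).decode() == pan

# Tokenisation: the surrogate is random, with no mathematical relationship
# to the PAN; recovery is only possible through the vault mapping, which
# would itself be encrypted and tightly access-controlled.
vault = {}
token = secrets.token_urlsafe(16)
vault[token] = pan
assert vault[token] == pan
```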

Tokens can be single-use or multi-use. Single-use tokens are used for a single transaction and then discarded. Multi-use tokens are used to track and correlate data across multiple systems and may even be stored in databases for reuse. The choice between single-use and multi-use, and of the generation method, depends on the specific needs of the system and should be made carefully, taking into account the business and security needs of the merchant.
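As a rough illustration of the difference (hypothetical helper names, with dictionaries standing in for the vault and its index), a multi-use scheme returns the same surrogate for a given PAN so that downstream systems can correlate transactions, whereas a single-use scheme mints a fresh token every time:

```python
import secrets

vault = {}        # token -> PAN
pan_index = {}    # PAN -> existing multi-use token

def single_use_token(pan: str) -> str:
    """A fresh token for every transaction, discarded after use."""
    token = secrets.token_hex(12)
    vault[token] = pan
    return token

def multi_use_token(pan: str) -> str:
    """The same PAN always maps to the same token, so downstream systems
    can correlate transactions without ever seeing the PAN."""
    if pan not in pan_index:
        token = secrets.token_hex(12)
        vault[token] = pan
        pan_index[pan] = token
    return pan_index[pan]

pan = "6744990012345821"
assert multi_use_token(pan) == multi_use_token(pan)    # stable surrogate
assert single_use_token(pan) != single_use_token(pan)  # new value each call
```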

The PCI DSS supplement identifies common ways tokens can be generated, including but not limited to the following (two of these are sketched in code after the list):

  • A mathematically reversible cryptographic function, based on a known strong cryptographic algorithm and strong cryptographic key (with a secure mode of operation and padding mechanism)
  • A one-way non-reversible cryptographic function (e.g., a hash function with strong, secret salt)
  • Assignment through an index function, sequence number or a randomly generated number (not mathematically derived from the PAN) 
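The sketch below illustrates two of these approaches: a one-way HMAC-SHA256 (standing in here for a "hash function with strong, secret salt") and assignment of a randomly generated number with a vault holding the mapping. It is indicative only; a real solution would follow the supplement's requirements for key management and vault security.

```python
import hashlib
import hmac
import secrets

pan = "6744990012345821"

# One-way non-reversible cryptographic function: HMAC-SHA256 keyed with a
# strong secret value (standing in for the "strong, secret salt").
secret_salt = secrets.token_bytes(32)
hash_token = hmac.new(secret_salt, pan.encode(), hashlib.sha256).hexdigest()

# Assignment through a randomly generated number that is not mathematically
# derived from the PAN; the vault keeps the only link back to the PAN.
vault = {}
random_token = str(secrets.randbelow(10**16)).zfill(16)
vault[random_token] = pan
```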

In the case of hash functions, additional controls need to be put in place to remove the risk of reverse engineering, in particular if both the hashed and truncated versions of the PAN are present in the same environment. In this case, to be PCI DSS compliant, other controls must ensure that the hashed and truncated versions cannot be correlated to reconstruct the original credit card number.

Security of Tokenisation Vaults

A tokenisation vault can become a single point of failure, so it needs to be guarded with stringent security measures. It is critical that these systems adhere to PCI DSS controls.

As per the PCI DSS tokenisation supplement, characteristics of a tokenisation system that meets PCI DSS requirements include but are not limited to the following:

  1. The tokenisation system does not provide PAN in any response to any application, system, network, or user outside of the merchant’s defined cardholder data environment (CDE).
  2. All tokenisation components are located on secure internal networks that are isolated from any untrusted and out-of-scope networks.
  3. Only trusted communications are permitted in and out of the tokenisation system environment.
  4. The tokenisation solution enforces strong cryptography and security protocols to safeguard cardholder data when stored and during transmission over open, public networks.
  5. The tokenisation solution implements strong access controls and authentication measures in accordance with PCI DSS Requirements 7 and 8.
  6. The tokenisation system components are designed to strict configuration standards and are protected from vulnerabilities.
  7. The tokenisation solution supports a mechanism for secure deletion of cardholder data as required by a data-retention policy.
  8. The tokenisation solution implements logging, monitoring, and alerting as appropriate to identify any suspicious activity and initiate response procedures.

What is De-Tokenisation?

De-tokenisation is the act of converting a token back to the original PAN when it is required. The system should be designed to remove any risk of unauthorised de-tokenisation, with strong access controls, well-defined roles and stringent authentication mechanisms.
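In practice that means the vault exposes de-tokenisation only to authorised, authenticated callers, along the lines of the sketch below (hypothetical roles and names, with logging and monitoring assumed around every call):

```python
vault = {"tok_8f3a": "6744990012345821"}   # token -> PAN mapping (illustrative)

# Roles explicitly authorised to recover the PAN; everything else is denied.
AUTHORISED_ROLES = {"payment-gateway"}

def detokenise(token: str, caller_role: str) -> str:
    """Return the original PAN only for authorised callers; every attempt
    would also be logged and monitored in a real system."""
    if caller_role not in AUTHORISED_ROLES:
        raise PermissionError(f"role '{caller_role}' may not de-tokenise")
    return vault[token]

print(detokenise("tok_8f3a", "payment-gateway"))   # permitted
# detokenise("tok_8f3a", "contact-centre-agent")   # raises PermissionError
```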

Conclusion

Tokenisation is an instrumental technique for limiting the exposure of PAN data in payment systems and achieving PCI DSS compliance. On its own, tokenisation does not guarantee PCI DSS compliance, but it is considered best practice for reducing PCI DSS scope and minimising your compliance burden, costs and risks.

IPSI are leaders in providing tokenisation solutions, including cloud-based, omnichannel capabilities and ancillary credit card scanning. To discuss your tokenisation needs, please contact us on 1300 975 630 or email us at assistance@ipsi.com.au.
