How to Reduce PCI DSS Audit Scope by Tokenizing Cardholder Data?

Tokenization is a technique that has long been the talk of the digital payment industry. Today, most financial institutions around the world adopt it to secure sensitive data, such as credit card numbers and bank account numbers, while it is being processed, transmitted, or stored. With year-on-year growth in incidents of data leaks, theft, and breaches, organizations are pushed to their limits to improve and evolve with the digital landscape. By adopting strategies such as data tokenization, organizations equip themselves against such incidents.

While it is evident that incidents of data breach and theft will only rise in the future, institutions need to implement appropriate measures like tokenization to secure their business and sensitive customer data. However, many businesses are still far from implementing this strategy, largely for lack of awareness: they do not know what tokenization is or how it works, and so they never adopt this powerful technique for securing card data. In this article, we explain the concept of tokenization and how it helps secure card data and reduce PCI DSS audit scope.

What is Tokenization?

Tokenization is a technique that replaces sensitive data with a non-sensitive, algorithmically generated value known as a token. It works on the principle of lowering the value of the data itself, making a breach unrewarding for hackers because there is nothing of value to steal. In the payment context, tokenization replaces a credit card number with an alternate set of characters that has no exploitable value, protecting the sensitive data while retaining all the information that applications need. Unlike encryption, a token cannot be deciphered to reveal the sensitive data it stands in for.
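
To make the idea concrete, here is a minimal Python sketch (the function name and default length are purely illustrative, not from any specific product). Note that the card number is never an input to the generator, so the token cannot be mathematically reversed into it:

```python
import secrets

def generate_token(length: int = 16) -> str:
    """Return a random numeric token. The real card number is never an
    input here, so no algorithm can derive the PAN from the token."""
    return "".join(secrets.choice("0123456789") for _ in range(length))

print(generate_token())  # e.g. '4830175926140387'
```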

How does Tokenization help keep Cardholder Data Safe?

The key to understanding how tokenization secures card data is understanding how it works. Tokenization replaces sensitive data, such as an account number, with a random string of characters, called a token, that has no significant value if breached. The token acts as a stand-in for the original data and cannot be exploited to recover it: there is no key or algorithm that can derive the original data from a token. Instead, tokenization uses a database, called a token vault, that stores the real sensitive data secured by encryption, while the token is used across applications as a substitute for the real data.

If the real data needs to be retrieved, the token is submitted to the vault and the index is used to fetch the real value for the authorization process. For the end user this is seamless; they are not even aware that their data is stored elsewhere in a different format. The benefit is that even if tokens are breached, they have no value or meaning to the attacker: there is no mathematical relationship to the real data they represent and no key to reverse-engineer them back to the real values. With the original card data kept safely inside the organization's secured data environment, the processing and transmission of card data become far less exposed. For these reasons, tokenization is today a popular technique for protecting payment data such as debit and credit card numbers.
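
Below is a minimal Python sketch of that vault-and-token flow. The TokenVault class, its method names, and the use of the cryptography library's Fernet cipher are illustrative assumptions, not a reference implementation; a production vault would sit behind strict access controls with hardened, externally managed keys:

```python
import secrets
from cryptography.fernet import Fernet  # pip install cryptography

class TokenVault:
    """Illustrative in-memory token vault: stores PANs encrypted and maps
    each token to its ciphertext. A real vault would use a hardened
    database and HSM-managed keys."""

    def __init__(self):
        self._fernet = Fernet(Fernet.generate_key())  # vault-local key
        self._store: dict[str, bytes] = {}            # token -> encrypted PAN

    def tokenize(self, pan: str) -> str:
        token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
        self._store[token] = self._fernet.encrypt(pan.encode())
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault (i.e. an authorized caller) can recover the PAN.
        return self._fernet.decrypt(self._store[token]).decode()

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert vault.detokenize(token) == "4111111111111111"
```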

Reducing the PCI Scope with Tokenization

Falling within the scope of PCI DSS compliance has a significant impact on a business in terms of resources, both cost and manpower. Any system or application that has access to sensitive card information, whether encrypted or not, falls in scope. Enterprises looking to simplify and reduce the scope of Payment Card Industry Data Security Standard (PCI DSS) compliance can do so by reducing the card data footprint in their systems and applications, and tokenization is a strategy that achieves exactly this. With it, organizations can significantly lower costs and increase their chances of a successful PCI audit. Given below are some of the ways tokenization helps reduce an organization's PCI scope.

1. Secure Data Vault

When an organization adopts tokenization, the sensitive payment card data is stored in a highly secure, centralized data vault, while a token replaces the credit card value in applications and databases. This drastically reduces the risk exposure of the card data, even in its encrypted state. The sensitive data is not accessible outside the data vault, except when originally captured at the beginning of a transaction or later retrieved from the vault by an authorized user or authenticated application.

For instance, when a cardholder uses their credit card to purchase a product online, the sensitive card number is transmitted in real time to the token server, which returns a token that replaces the card data across applications and downstream payment processing. Meanwhile, the sensitive card data is encrypted and stored in the centralized data vault. If the original data needs to be accessed, the value can be decrypted on request by an authorized user or application. This significantly limits the risk to the credit card data and ensures it is only ever stored, encrypted, in a secure data vault.
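
Continuing the hypothetical TokenVault sketch from earlier, the checkout flow might look like this, with only the token travelling downstream (the order fields and card number are illustrative test data):

```python
vault = TokenVault()

# 1. Card number captured once at checkout, sent straight to the token server.
token = vault.tokenize("5500005555555559")

# 2. Downstream systems (order DB, receipts, analytics) only ever see the token.
order = {"order_id": "A-1021", "payment_ref": token}

# 3. Only an authorized payment application asks the vault for the real PAN.
pan_for_authorization = vault.detokenize(token)
```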

2. Data Surrogates

In tokenization, a token is used as a surrogate value for the original sensitive data. The token represents the actual data, which is encrypted and stored in the central data vault. Tokens can be engineered to preserve the length and format of the original values, or parts of them, so that applications built to process real card data do not reject the transaction for failing the card number verification algorithm, Mod 10 (the Luhn check). Typically, tokens are generated to preserve the original first six and last four digits of the credit card number, and the token format is configurable, including which part, if any, of the original value to preserve. Generally, applications require only the last four digits of the card for validating credentials. This significantly reduces the business impact of PCI DSS compliance: since the application no longer contains any credit card information, not even in encrypted form, the entire application falls out of PCI DSS scope.
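
Here is a short sketch of such a format-preserving token, under the assumptions above: it keeps the first six and last four digits, randomizes the middle, and retries until the result passes the Mod 10 (Luhn) check. All names are illustrative:

```python
import secrets

def luhn_checksum(number: str) -> int:
    digits = [int(d) for d in number]
    # Double every second digit from the right, subtracting 9 if it exceeds 9.
    for i in range(len(digits) - 2, -1, -2):
        digits[i] *= 2
        if digits[i] > 9:
            digits[i] -= 9
    return sum(digits) % 10  # 0 means the number passes Mod 10

def format_preserving_token(pan: str) -> str:
    """Keep first six and last four digits, randomize the middle, and
    retry until the candidate passes Luhn and differs from the real PAN."""
    head, tail = pan[:6], pan[-4:]
    middle_len = len(pan) - 10
    while True:
        middle = "".join(secrets.choice("0123456789") for _ in range(middle_len))
        candidate = head + middle + tail
        if luhn_checksum(candidate) == 0 and candidate != pan:
            return candidate

print(format_preserving_token("4111111111111111"))  # e.g. '4111112640921111'
```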

3. Data Relation Token

Tokenization can also maintain a one-to-one relation between a credit card number and its token, preserving referential integrity across systems. Referential integrity allows transaction analysis to be performed with tokens rather than with the credit card numbers themselves, which helps remove those analytical systems from scope (a sketch follows below). Proper network segmentation remains important; done well, it reduces the scope of the PCI audit drastically. This in turn reduces the exposure to data breach, since any unauthorized access yields only tokens, not actual credit card numbers.
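
As a sketch of this one-to-one relation, the hypothetical vault below extends the earlier TokenVault so that a repeated card number always maps back to the same token. The keyed-HMAC index is an assumption about how a vault might look up existing mappings without storing raw card numbers in the index:

```python
import hashlib
import hmac
import secrets

class ConsistentVault(TokenVault):  # TokenVault from the earlier sketch
    """Same PAN in, same token out: preserves referential integrity so
    downstream joins and analytics line up across systems."""

    def __init__(self):
        super().__init__()
        self._index_key = secrets.token_bytes(32)
        self._pan_index: dict[str, str] = {}  # HMAC(PAN) -> existing token

    def tokenize(self, pan: str) -> str:
        digest = hmac.new(self._index_key, pan.encode(), hashlib.sha256).hexdigest()
        if digest in self._pan_index:
            return self._pan_index[digest]    # reuse the one-to-one mapping
        token = super().tokenize(pan)
        self._pan_index[digest] = token
        return token

vault = ConsistentVault()
assert vault.tokenize("4111111111111111") == vault.tokenize("4111111111111111")
```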

4. Tokens Have No Value

Unlike encryption, where an algorithm mathematically derives the output ciphertext from the input data, there is no mathematical relationship between a token and the data value; the only relationship is referential. Tokens can be transmitted securely across the network between applications, databases, and business processes without any worry of their being decrypted and exposing sensitive data to an unauthorized person. Meanwhile, the original data stays encrypted in the central data vault, with access limited to the authorized applications that genuinely need to retrieve it. This gives the sensitive data an extra layer of protection and shrinks the footprint of card data that must otherwise be secured and monitored.
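
The contrast is easy to demonstrate: ciphertext inverts with a key, while a token inverts only through the vault's lookup table. The snippet below (again using Fernet purely for illustration) shows the difference:

```python
from cryptography.fernet import Fernet

f = Fernet(Fernet.generate_key())
ciphertext = f.encrypt(b"4111111111111111")
print(f.decrypt(ciphertext))  # b'4111111111111111' -- anyone holding the key can invert this

# A token has no such inverse: with no key or algorithm relating it to the
# PAN, recovering the card number requires the vault's lookup table itself.
token = "7392014856120043"    # a random surrogate with only a referential link
```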

5. Sensitive Key Management

PCI DSS clearly requires sensitive keys to be appropriately secured and kept in as few locations as possible to limit risk exposure. Tokenization achieves this naturally, restricting key distribution to the bare minimum: the keys live only with the authorized central token manager. In this way, the technique helps minimize the scope of PCI DSS compliance and reduces the risk of key compromise.

Conclusion

With traditional encryption, there is always a possibility that a lapse in key management leads to data breach or theft. With tokenization that is not the case, for it offers an extra layer of security and facilitates centralized key management. Tokens can be used safely throughout the organization, minimizing the risk of exposing the actual sensitive data while allowing business and analytical applications to work without hassle. Combined with the reduction in PCI scope, this makes tokenization an ideal technique for adoption, especially for businesses looking to reduce their scope and achieve PCI DSS compliance.

Author Bio

Narendra Sahoo (PCI QSA, PCI QPA, CISSP, CISA, and CRISC) is the Founder and Director of VISTA InfoSec, a global information security consulting firm based in the US, Singapore, and India. Mr. Sahoo holds more than 25 years of experience in the IT industry, with expertise in information risk consulting, assessment, and compliance services. VISTA InfoSec specializes in information security audit, consulting, and certification services, including GDPR, HIPAA, CCPA, NESA, MAS-TRM, PCI DSS Compliance & Audit, PCI PIN, SOC2, PDPA, and PDPB, to name a few. Since 2004, the company has worked with organizations across the globe to address the regulatory and information security challenges in their industries, and has been instrumental in helping top multinational companies achieve compliance and secure their IT infrastructure.
