Tokenization to Improve Data Security and Privacy
Palika Sharma
Senior Consultant at IBM - CISA, CISM, CIPM, CEH, AWS CCP, ISO 27001 LA
Protecting sensitive data is a top priority for every organization today. Enterprises must find ways to protect sensitive data while still granting data consumers the access they need to execute operational and analytical workloads. Tokenization is becoming an increasingly popular way to protect data and can play a vital role in a data privacy protection solution. In this article, I'll try to answer: what is tokenization, and what are the most common use cases around it?
Tokenization is an advanced form of pseudonymization used to protect individuals' identities while maintaining the original data's functionality; it is a process by which PANs (primary account numbers), PHI (protected health information), PII (personally identifiable information), and other sensitive data elements are replaced by surrogate values, or tokens.
Unlike encrypted data, tokenized data is undecipherable and irreversible. There is no mathematical relationship between a token and its original value, so tokens cannot be returned to their original form without additional, separately stored data. As a result, a breach of a tokenized environment does not compromise the original sensitive data.
Is detokenization possible?
Yes, detokenization is possible. Detokenization is the reverse process: exchanging the token for the original data. However, it can be done only by the original tokenization system; there is no other way to obtain the original value from the token alone.
Two options for tokenization - vault and vaultless:
In vault tokenization, a vault database is maintained in which the sensitive data and its corresponding non-sensitive tokens are stored. This table of sensitive and non-sensitive values is used to detokenize previously tokenized data. As the data grows, the size of the vault database grows, which in turn increases the processing time for detokenization and adds complexity to the detokenization implementation.
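To make the vault approach concrete, here is a minimal, illustrative sketch (not a production design) in which tokens are random surrogates and an in-memory lookup table plays the role of the vault; the class and method names are hypothetical.

```python
import secrets

class VaultTokenizer:
    """Toy vault-based tokenizer: tokens are random surrogates stored
    alongside the original values in a lookup table (the 'vault')."""

    def __init__(self):
        self._token_to_value = {}   # the vault: token -> original value
        self._value_to_token = {}   # reverse index so repeated values reuse a token

    def tokenize(self, value: str) -> str:
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(8)          # no mathematical relation to the value
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only this system, holding the vault, can reverse the mapping.
        return self._token_to_value[token]

vault = VaultTokenizer()
t = vault.tokenize("4111 1111 1111 1111")
print(t)                      # e.g. 'a3f9c2d41b7e6c05'
print(vault.detokenize(t))    # original PAN, recoverable only via the vault
```

Note how the vault dictionary grows with every new value tokenized, which is exactly the scaling drawback described above.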
To overcome the disadvantages of vault tokenization, vaultless tokenization comes into play. It is more efficient, and arguably safer, than vault tokenization because it does not maintain a database; instead, it uses secure cryptographic devices. These devices use standards-based algorithms to convert sensitive data into tokens, and during detokenization the original data can be recovered from the token without needing a tokenization vault database.
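For illustration only, the sketch below mimics the vaultless idea with a tiny keyed Feistel transform over even-length digit strings, so the token keeps the input's format and can be reversed with the key alone. This is a toy: real deployments use standards-based format-preserving encryption (such as NIST FF1/FF3-1) running inside hardened cryptographic devices, not hand-rolled code, and the key name here is hypothetical.

```python
import hmac, hashlib

# Hypothetical key; in practice it would live inside a secure cryptographic device.
KEY = b"demo-key-held-by-the-tokenization-service"
ROUNDS = 8

def _round_value(key: bytes, round_no: int, half: str, width: int) -> int:
    """Keyed pseudo-random function for one Feistel round."""
    mac = hmac.new(key, f"{round_no}:{half}".encode(), hashlib.sha256).hexdigest()
    return int(mac, 16) % (10 ** width)

def tokenize(digits: str, key: bytes = KEY) -> str:
    """Toy format-preserving transform for an even-length digit string."""
    mid = len(digits) // 2
    left, right = digits[:mid], digits[mid:]
    for i in range(ROUNDS):
        new_right = (int(left) + _round_value(key, i, right, mid)) % (10 ** mid)
        left, right = right, f"{new_right:0{mid}d}"
    return left + right

def detokenize(token: str, key: bytes = KEY) -> str:
    """Reverses tokenize() using only the key -- no vault lookup needed."""
    mid = len(token) // 2
    left, right = token[:mid], token[mid:]
    for i in reversed(range(ROUNDS)):
        old_left = (int(right) - _round_value(key, i, left, mid)) % (10 ** mid)
        left, right = f"{old_left:0{mid}d}", left
    return left + right

pan = "4111111111111111"
tok = tokenize(pan)
print(tok, detokenize(tok) == pan)   # same length and format, round-trips to True
```

The point of the sketch is the trade-off: nothing has to be stored per value, but the secrecy of the key (and the device holding it) becomes the whole security boundary.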
Now let us have a look at the use cases around tokenization.
1. Satisfy Privacy Regulations
Tokens are generally not subject to compliance requirements if there is sufficient separation of the tokenization implementation and the applications using the tokens. Tokenized data isn’t designed to be returnable to its original, identifiable form, rendering it useless for almost anything but very high-level data aggregation and analysis.
Tokenization can help satisfy many privacy and security standards, such as the GDPR and PCI DSS. The GDPR explicitly states that its data-protection principles do not apply to anonymous information.
A common use case for PCI DSS compliance is replacing PANs with tokens in the data sent to a service provider, which keeps the service provider from being subject to PCI DSS.
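As a sketch of how that can look in practice (hypothetical helper names, in-memory dictionary standing in for the tokenization provider's vault), the PAN is replaced before the record leaves the merchant, with only the last four digits kept in the clear for receipts and support lookups:

```python
import secrets

_vault = {}  # token -> PAN; stands in for the tokenization provider's vault

def tokenize_pan(pan: str) -> str:
    """Replace a PAN with a surrogate that keeps only the last 4 digits visible."""
    token = f"tok_{secrets.token_hex(6)}_{pan[-4:]}"
    _vault[token] = pan
    return token

# What the downstream service provider receives -- no PAN, so the record
# (and often the provider's environment) can fall outside PCI DSS scope.
payment_record = {
    "customer_id": "C-1042",
    "card": tokenize_pan("4111111111111111"),
    "amount": "59.90",
}
print(payment_record)   # {'customer_id': 'C-1042', 'card': 'tok_..._1111', 'amount': '59.90'}
```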
2. No need to share sensitive data with service providers
Replacing sensitive data with tokens before providing it to service providers can eliminate the risk of having sensitive data within the service providers' control, and can keep compliance requirements from applying to their environments. A common example is the payment process, where providers offer tokenization services to merchants: they tokenize the cardholder data and return a token that the merchant can use to complete card purchase transactions.
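A minimal sketch of that exchange is shown below; the processor endpoint, request and response fields, and the merchant-side storage are all hypothetical stand-ins for whatever API a real payment provider exposes.

```python
import requests  # the HTTP client; endpoint and fields below are hypothetical

PROCESSOR_URL = "https://api.example-processor.com/v1/tokens"  # placeholder URL
merchant_db = {}  # the merchant stores tokens only, never PANs

def store_card(customer_id: str, pan: str, expiry: str) -> str:
    """Send card data to the processor once; keep only the returned token."""
    resp = requests.post(PROCESSOR_URL, json={"pan": pan, "expiry": expiry}, timeout=10)
    resp.raise_for_status()
    token = resp.json()["token"]          # assumed response field
    merchant_db[customer_id] = token
    return token

def charge(customer_id: str, amount_cents: int) -> None:
    """Later charges reference the stored token; the PAN never re-enters merchant systems."""
    token = merchant_db[customer_id]
    requests.post(f"{PROCESSOR_URL}/{token}/charges",
                  json={"amount": amount_cents}, timeout=10).raise_for_status()
```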
3. Restrict sensitive data to a “need-to-know” basis
Tokenization can be used to add a layer of explicit access controls to the de-tokenization of individual data items, which can be used to implement and demonstrate least-privileged access to sensitive data. In instances where data may be co-mingled in a common repository such as a data lake, tokenization can help ensure that only those with the appropriate access can perform the de-tokenization process and reveal sensitive data.
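For illustration, the sketch below layers a simple role check in front of detokenization; the roles, the in-memory vault, and the function names are hypothetical stand-ins for whatever IAM and tokenization services an organization actually runs.

```python
import secrets

_vault = {}                                  # token -> original value (toy vault)
DETOKENIZE_ROLES = {"fraud_analyst", "dpo"}  # hypothetical roles allowed to detokenize

def tokenize(value: str) -> str:
    token = secrets.token_hex(8)
    _vault[token] = value
    return token

def detokenize(token: str, requester_role: str) -> str:
    """Reveal the original value only to roles with a need to know."""
    if requester_role not in DETOKENIZE_ROLES:
        raise PermissionError(f"role '{requester_role}' may not detokenize data")
    return _vault[token]

ssn_token = tokenize("123-45-6789")
print(detokenize(ssn_token, "fraud_analyst"))   # allowed: returns the original value
print(detokenize(ssn_token, "data_scientist"))  # raises PermissionError
```

Analysts who only need to join or count records can work with the tokens directly in the data lake; the explicit check is hit only when someone asks to see the real value.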
4. Reduces risk from data breaches
Tokenization increases the level of difficulty attackers face when attempting to steal tokenized information. Data tokenized without a token vault cannot be reversed by an attacker, so even if it is stolen, it cannot be returned to its original form and is useless to them. If tokenization is done with a token vault, it is still extremely difficult for attackers to recover the original information, since the vault is stored separately from the tokenized data. Tokenization can't protect your business from a data breach, but it can reduce the financial fallout from any potential breach.
5. Payment Innovations
Tokenization has made payments safer and improved user experiences, whether online, mobile, or in-app. The rising popularity of in-store payments from mobile devices also relies on tokenization. When consumers pay with a mobile wallet such as Apple Pay or Google Pay, their personal credit card data is stored on their phone as a token. Additional security comes from the smartphones themselves, which add a layer of biometric security and other advanced authentication measures.
6. Cloud workload migration
Many businesses want to use SaaS for marketing and sales, but local law and internal security policies may derail the program because PII and other sensitive data is not allowed to leave the logical boundary of the organization. As part of the business requirement, actual PII and subscriber data would be exposed to the cloud / SaaS environment. With cloud migration and adoption, processing and/or storage of PII and subscriber data in plain text happens outside the organization's logical boundaries, and a breach there could lead to fines, litigation, loss of reputation and loss of revenue. Tokenizing PII and subscriber data before it leaves the organization not only helps with compliance but also protects against these risks.
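As a rough sketch (assuming an in-memory vault kept inside the organization's boundary and a hypothetical set of PII field names), the idea is simply to swap the sensitive fields for tokens before the record crosses the boundary:

```python
import secrets

_vault = {}                                # stays inside the organization's boundary
PII_FIELDS = {"name", "email", "msisdn"}   # hypothetical fields treated as PII

def tokenize_record(record: dict) -> dict:
    """Return a copy of the record with PII fields replaced by tokens."""
    safe = dict(record)
    for field in PII_FIELDS & record.keys():
        token = secrets.token_hex(8)
        _vault[token] = record[field]
        safe[field] = token
    return safe

subscriber = {"name": "A. Kumar", "email": "a.kumar@example.com",
              "msisdn": "+91-9800000001", "plan": "prepaid-199"}

outbound = tokenize_record(subscriber)
# Only 'outbound' is sent to the SaaS platform; non-sensitive fields such as
# 'plan' remain usable for marketing analytics, while PII never leaves in clear text.
print(outbound)
```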
7. Works with Legacy Systems
Tokenization works well with legacy systems. Even if an application and its database have been in use for years or even decades, the information secured therein can be tokenized without the need to reinvent or recreate the application. Tokenization also uses fewer resources than encryption and has less chance of failure compared to other data masking methods.
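One reason this works is that a token can keep the length and character classes of the original value, so existing column definitions and validation rules keep passing. A hedged, toy sketch (hypothetical function name; a real system would also handle collisions and store the vault securely) could look like this:

```python
import secrets, string

_vault = {}  # token -> original value

def format_preserving_token(value: str) -> str:
    """Random surrogate with the same length and character classes as the input,
    so legacy column types and validation rules still accept it."""
    out = []
    for ch in value:
        if ch.isdigit():
            out.append(secrets.choice(string.digits))
        elif ch.isalpha():
            out.append(secrets.choice(string.ascii_uppercase if ch.isupper()
                                      else string.ascii_lowercase))
        else:
            out.append(ch)          # keep separators like '-' or ' ' as-is
    token = "".join(out)
    _vault[token] = value
    return token

# A legacy fixed-width national-ID column keeps working unchanged:
print(format_preserving_token("123-45-6789"))   # e.g. '805-31-4927'
```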
Conclusion
Using tokenization solutions to replace sensitive data offers many security and compliance benefits. These include tapping new opportunities by turning data into a promising asset class, lowering security risk, and shrinking audit scope, which in turn reduces compliance costs and regulatory data handling requirements.