Tokenization

Tokenization substitutes a sensitive identifier, such as a unique ID number or other personally identifiable information (PII), with a non-sensitive equivalent known as a "token." Tokens have no intrinsic or exploitable meaning or value, and they stand in for identifiers or PII when representing users in databases or during transactions such as authentication. The mapping from original data to token is typically created through randomization or hashing, making tokens practically impossible to reverse without access to the tokenization system.
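As a rough sketch of the randomization approach described above (a minimal illustration, not any specific production system), a tokenization service can generate a random token and record the mapping in a "vault" that only the tokenization system itself can query:

```python
import secrets


class TokenVault:
    """Minimal in-memory token vault mapping random tokens to original PII."""

    def __init__(self):
        self._token_to_pii = {}
        self._pii_to_token = {}

    def tokenize(self, pii: str) -> str:
        # Reuse the existing token so the same identifier always maps
        # to the same token within this vault.
        if pii in self._pii_to_token:
            return self._pii_to_token[pii]
        # The token is random, so it carries no information about the PII.
        token = secrets.token_hex(16)
        self._token_to_pii[token] = pii
        self._pii_to_token[pii] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only systems with access to the vault can reverse the mapping.
        return self._token_to_pii[token]


vault = TokenVault()
token = vault.tokenize("1234-5678-9012")
assert token != "1234-5678-9012"                     # token reveals nothing
assert vault.detokenize(token) == "1234-5678-9012"   # vault can reverse it
```

Because the token is random rather than derived from the PII, an attacker who steals only the tokenized database learns nothing about the original identifiers.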

While not a new technology, tokenization has been extensively used in credit and debit card systems to replace card data, such as the primary account number (PAN), with unique, randomly generated tokens. This substitution minimizes the number of systems with access to the original card data, thereby reducing the risk of fraud in case of system compromise.

From a privacy perspective, tokenization ensures that only tokens, rather than permanent identity numbers or other PII, are exposed or stored during transactions. Moreover, by representing the same person with different tokens in different databases, tokenization limits the proliferation of a single identifier, mitigating privacy risks and the potential for fraud.

[Figure: Tokenization vs. Encryption. Source: World Bank]

Key properties of tokens are uniqueness and irreversibility: service providers and other unauthorized entities cannot reverse-engineer the original identity or PII from the token alone. Tokenization typically falls into two primary categories:

1. Front-end tokenization: Users generate tokens as part of an online service, which can subsequently be used in digital transactions instead of the original identifier value. While this approach empowers users, it may exacerbate digital divides due to technical requirements and digital literacy barriers.

2. Back-end tokenization: Identity or token providers tokenize identifiers before sharing them with other systems, thereby controlling data correlation and limiting the spread of original identifiers. Back-end tokenization occurs automatically without user intervention, reducing the risk of digital divides and protecting identifiers and PII at the source.
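The back-end approach can be sketched with a keyed hash (a hypothetical illustration; real providers such as UIDAI use their own schemes). The token provider derives a different deterministic token per sector from a secret key it alone holds, so relying parties cannot correlate records across sectors:

```python
import hashlib
import hmac


def sector_token(identifier: str, sector_id: str, secret_key: bytes) -> str:
    """Derive a deterministic, sector-specific token from an identifier.

    The same person receives a different token in each sector's database,
    so records cannot be linked across sectors, yet the token provider
    (the sole holder of secret_key) can regenerate the token whenever it
    needs to authenticate the user.
    """
    message = f"{sector_id}:{identifier}".encode()
    return hmac.new(secret_key, message, hashlib.sha256).hexdigest()


key = b"token-provider-secret"  # illustrative key; never hard-code in practice
t_health = sector_token("ID-42", "health", key)
t_bank = sector_token("ID-42", "banking", key)

assert t_health != t_bank                                 # no cross-sector linking
assert t_health == sector_token("ID-42", "health", key)   # stable within a sector
```

Because the derivation is keyed, service providers holding only the token cannot invert it, while the provider avoids storing an explicit token-to-identifier table.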

UIDAI also introduced back-end tokenization to address the storage of Aadhaar numbers in service provider databases.

Source: World Bank

While both tokenization and encryption obscure personal data, they do so differently. Tokenization is often simpler and cheaper to implement than encryption, with a lower impact on relying parties. However, it requires a means of mapping tokens back to the actual identifier or PII values, which can pose scalability challenges. Nonetheless, implementations such as Verify and Aadhaar manage tokenization at scale effectively, without needing to share the underlying data for authentication purposes.
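The architectural difference can be made concrete with a toy contrast (the XOR "cipher" below is deliberately insecure and stands in for real encryption purely for illustration): anyone holding an encryption key can recover the data locally, whereas a token can only be resolved by a round trip to the central vault, which is where the scalability pressure arises.

```python
import secrets

VAULT = {}  # token -> original value; held only by the token provider


def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # XOR stand-in for a real cipher (NOT secure); XORing again decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


def tokenize(value: str) -> str:
    # The token is random, so recovery is possible only via the vault.
    token = secrets.token_hex(8)
    VAULT[token] = value
    return token


pan = "4111111111111111"
key = b"shared-key"

# Encryption: any party holding `key` can recover the PAN locally.
ciphertext = toy_encrypt(pan.encode(), key)
assert toy_encrypt(ciphertext, key).decode() == pan

# Tokenization: the token alone reveals nothing; recovery requires
# querying the central vault, which must scale with lookup demand.
token = tokenize(pan)
assert VAULT[token] == pan
```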
