Tokenization


ALL ABOUT TOKENIZATION


INTRODUCTION

A Brief History of Tokenism


The term “tokenism” first rose to prominence in the 1950s in the speeches of Martin Luther King Jr. and Malcolm X. Both civil rights leaders denounced tokenism as a form of hypocrisy in the fight against racial segregation in the United States, arguing that the practice strives for only a minimal acceptance of marginalized groups within mainstream society, especially the experiences of Black people within a white majority group.

There’s been hype around digital-asset tokenization for years, since its introduction back in 2017. But despite the big predictions, it hasn’t yet caught on in a meaningful way. We are seeing slow movement: US-based fintech infrastructure firm Broadridge now facilitates more than $1 trillion monthly on its distributed ledger platform. Let’s discuss how tokenization works and what it might mean for the future.

Tokenization is not a new technology. In credit and debit card systems, for example, tokenization has long been used to replace data on the card (e.g., the primary account number, or PAN) with a unique, randomly generated token that can represent the card data in transactions without revealing the original card data. This dramatically reduces the number of systems with access to the original card data, and with it the risk of fraud should a system become compromised.
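The card-tokenization scheme described above (replacing a PAN with a random surrogate held in a vault) can be sketched in a few lines. This is a toy illustration, not a PCI DSS-compliant implementation; the class and method names are hypothetical.

```python
import secrets

# Toy sketch of card tokenization with a token vault (illustrative only;
# real payment systems use certified, PCI DSS-compliant vaults).
class TokenVault:
    def __init__(self) -> None:
        self._vault: dict[str, str] = {}  # token -> original PAN, kept only here

    def tokenize(self, pan: str) -> str:
        token = secrets.token_hex(8)      # random surrogate: reveals nothing about the PAN
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]         # only the vault can resolve a token

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(vault.detokenize(token))            # 4111111111111111
```

Downstream systems store and pass around only `token`; a breach of any of them exposes nothing about the card.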

Blockchain, smart contracts, and digital assets—the latter created via a process called tokenization—stand to change the way we exchange ideas, information, and money. For organizations and early adopters, there is significant value on the table.

Definition


Tokenization is the process of issuing a digital representation of an asset on a blockchain. In the data-protection context, “tokenization is a process by which PANs, PHI, PII, and other sensitive data elements are replaced by replacement values, or tokens. Tokenization is really a form of encryption, but the two terms are typically used differently.”


“Tokenization refers to a process by which a piece of sensitive data, such as a credit card number, is replaced by a surrogate value known as a token. The sensitive data still generally needs to be stored securely at one centralized location for subsequent reference and requires strong protections around it. The security of a tokenization approach depends on the security of the sensitive values and the algorithm and process used to create the surrogate value and map it back to the original value.”


We’re progressing toward the next era of the internet in fits and starts. Web3 is said to offer the potential of a new, decentralized internet, controlled by participants via blockchains rather than profit-motivated corporations. But progress hasn’t been linear: one major setback was the meltdown of the cryptocurrency market in 2022, triggered by multiple cryptocurrency failures and high-profile cases of fraud. Regulators are paying increased attention to Web3 players, and public curiosity is peaking. But Web3 is about much more than crypto.

Tokenization can create several types of tokens. Stablecoins, a type of cryptocurrency pegged to real-world money and designed to be fungible (interchangeable), are one example. Another type of token is an NFT, a non-fungible token that can’t be replicated, which serves as a digital proof of ownership people can buy and sell. Tokenization is potentially a big deal: industry experts have forecast up to $5 trillion in tokenized digital-securities trade volume by 2030.

Let’s get into the nitty-gritty of how tokenization works and what it might mean for the future. Before we dig deeper into tokenization, let’s get some basics defined. As we’ve seen, Web3 is a new type of internet, built primarily on three types of technology:

Blockchain. A blockchain is a digitally distributed, decentralized ledger that exists across a computer network and facilitates the recording of transactions. As new data are added to the network, a new block is created and appended permanently to the chain. All nodes on the blockchain are then updated to reflect the change. This means the system is not subject to a single point of control or failure.

Smart contracts. Smart contracts are software programs that are automatically executed when specified conditions are met, like terms agreed on by a buyer and seller. Smart contracts are established in code on a blockchain that can’t be altered.

Digital assets and tokens. These are items of value that exist only digitally. They can include cryptocurrencies, stablecoins, central bank digital currencies, and NFTs. They can also include tokenized versions of assets, including real things like art or concert tickets. As we’ll see, these technologies come together to support a variety of breakthroughs related to tokenization.
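To make the “appended permanently” idea concrete, here is a minimal, hypothetical hash-linked ledger sketched in Python. It illustrates only the linking mechanism; a real blockchain adds a network, consensus, and signatures, none of which appear here.

```python
import hashlib
import json

# A minimal hash-linked ledger (mechanism sketch only: no network, consensus,
# or mining). Each block stores the hash of its predecessor, so changing any
# earlier block breaks the link to every later one.
def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev": prev})

def chain_is_valid(chain: list) -> bool:
    # Every block must reference the hash of the block before it.
    return all(chain[i]["prev"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

chain: list = []
append_block(chain, "alice pays bob 5")
append_block(chain, "bob pays carol 2")
print(chain_is_valid(chain))              # True

chain[0]["data"] = "alice pays bob 500"   # tamper with history...
print(chain_is_valid(chain))              # False: the later link is now broken
```

Tampering with any recorded transaction invalidates every subsequent link, which is why appended data is effectively permanent once the ledger is replicated across many nodes.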


The potential benefits of tokenization for financial services providers

Some industry leaders believe tokenization stands to transform the structure of financial services and capital markets by letting asset holders reap the benefits of blockchain, including 24/7 operations and data availability. Blockchain also offers faster transaction settlement and a higher degree of automation (via embedded code that only gets activated if certain conditions are met).

While yet to be tested at scale, tokenization’s potential benefits include the following:

  • Faster transaction settlement, fueled by 24/7 availability. At present, most financial settlements occur two business days after the trade is executed (T+2); in theory, this gives each party time to get their documents and funds in order. The instant settlements made possible by tokenization could translate to significant savings for financial firms in high-interest-rate environments.
  • Operational cost savings, delivered by 24/7 data availability and asset programmability. Embedding operations such as interest calculation and coupon payment into the smart contract of the token would automate these functions and require less hands-on human effort.
  • Democratization of access. By streamlining operationally intensive manual processes, servicing smaller investors can become an economically attractive proposition for financial service providers. Before true democratization of access is realized, however, tokenized asset distribution will need to scale significantly.
  • Enhanced transparency, powered by smart contracts. Smart contracts are sets of instructions coded into tokens issued on a blockchain that can self-execute under specific conditions. One example could be a smart contract for carbon credits, where blockchain can provide an immutable and transparent record of credits, even as they’re traded.
  • Cheaper and more nimble infrastructure. Blockchains are open source, thus inherently cheaper and easier to iterate than traditional financial services infrastructure.
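The self-executing behavior of a smart contract, such as the carbon-credit example above, can be modeled in miniature. This is a hypothetical Python sketch of the idea only; real smart contracts are deployed on-chain and typically written in a language like Solidity, and all names below are invented.

```python
# Toy model of a self-executing agreement: the "contract" re-checks its
# condition on every event and settles automatically once the agreed
# payment has arrived. The ledger is append-only, giving a transparent
# record of every deposit and transfer.
class CarbonCreditContract:
    def __init__(self, seller: str, buyer: str, credits: int, price: int):
        self.seller, self.buyer = seller, buyer
        self.credits, self.price = credits, price
        self.escrow = 0
        self.ledger = []               # append-only record of every event

    def deposit(self, amount: int) -> None:
        self.escrow += amount
        self.ledger.append(("deposit", self.buyer, amount))
        self._maybe_settle()           # condition is re-checked automatically

    def _maybe_settle(self) -> None:
        if self.escrow >= self.price:  # condition met -> credits transfer
            self.ledger.append(("transfer", self.credits, self.seller, self.buyer))

contract = CarbonCreditContract("registry", "acme-corp", credits=100, price=50)
contract.deposit(30)                   # below the agreed price: nothing settles
contract.deposit(20)                   # threshold reached: transfer self-executes
print(contract.ledger[-1])             # ('transfer', 100, 'registry', 'acme-corp')
```

No human intervenes between the final deposit and the transfer; the condition embedded in the code triggers settlement on its own, which is the property the bullet list describes.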

How does an asset get tokenized?

There are four typical steps involved in asset tokenization:

  • Asset sourcing. The first step of tokenization is figuring out how to tokenize the asset in question. Tokenizing a money market fund, for example, will be different from tokenizing a carbon credit. This process requires knowing whether the asset will be treated as a security or a commodity and which regulatory frameworks apply.
  • Digital asset issuance and custody. If the digital asset has a physical counterpart, the latter must be moved to a secure facility that’s neutral to both parties. Then a token, a network, and compliance functions are selected, coming together to create a digital representation of the asset on a blockchain. Access to the digital asset is then stored pending distribution.
  • Distribution and trading. The investor will need to set up a digital wallet to store the digital asset. Depending on the asset, a secondary trading venue—an alternative to an official exchange that is more loosely regulated—may be created for the asset.
  • Asset servicing and data reconciliation. Once the asset has been distributed to the investor, it will require ongoing maintenance. This should include regulatory, tax, and accounting reporting; notice of corporate actions; and more.
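The four steps above can be sketched as a minimal pipeline. Every name and structure here is hypothetical, chosen only to show how the stages hand off to one another.

```python
from dataclasses import dataclass, field

# The four tokenization steps as a minimal pipeline (all names hypothetical).
@dataclass
class TokenizedAsset:
    underlying: str
    framework: str                      # step 1: security vs. commodity treatment
    token_id: str = ""
    custodian: str = ""
    holders: dict = field(default_factory=dict)
    events: list = field(default_factory=list)

def issue(asset: TokenizedAsset, token_id: str, custodian: str) -> TokenizedAsset:
    # Step 2: create the digital representation and place it in neutral custody.
    asset.token_id, asset.custodian = token_id, custodian
    return asset

def distribute(asset: TokenizedAsset, wallet: str, units: int) -> TokenizedAsset:
    # Step 3: deliver units to an investor's digital wallet.
    asset.holders[wallet] = asset.holders.get(wallet, 0) + units
    return asset

def service(asset: TokenizedAsset, event: str) -> TokenizedAsset:
    # Step 4: ongoing reporting, corporate actions, reconciliation.
    asset.events.append(event)
    return asset

mmf = TokenizedAsset("money market fund", framework="security")
mmf = issue(mmf, token_id="MMF-001", custodian="neutral-custody-co")
mmf = distribute(mmf, wallet="0xabc", units=1000)
mmf = service(mmf, "quarterly tax report issued")
print(mmf.holders)                      # {'0xabc': 1000}
```

The point of the sketch is the ordering: sourcing decisions (the `framework`) constrain issuance, issuance precedes distribution, and servicing continues for the life of the asset.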


Is the time finally right for tokenization to catch on?

Maybe. Financial services players are already beginning to tokenize cash. At present, approximately $120 billion of tokenized cash is in circulation in the form of fully reserved stablecoins. As noted above, stablecoins are a type of cryptocurrency pegged to a physical currency (or commodity or other financial instrument) with the goal of maintaining value over time. Financial services players may be starting to play with tokenizing cash—theirs is the biggest use case to date—but it’s not yet happening on a scale that could be considered a tipping point. That said, there are a few reasons that tokenizing might take off. For one thing, the higher interest rates of the current cycle—while cause for complaint for many—are improving the economics for some tokenization use cases, in particular those dealing with short-term liquidity. (When interest rates are high, the difference between a one-hour and a 24-hour transaction can equal a lot of money.)






What’s more, since tokenization debuted five years ago, many financial services companies have significantly grown their digital asset teams and capabilities. These teams are experimenting more and continually expanding their capabilities. As digital asset teams mature, we may see tokenization increasingly used in financial transactions.


Tokenization can protect privacy by ensuring that only tokens, rather than a permanent identity number or other PII, are exposed or stored during a transaction.?In addition—where the same person is represented by different tokens in different databases—tokenization can limit the propagation of a single identifier (e.g., a unique ID number). This can help limit the ability to correlate a person’s data across different databases, which can be a privacy risk and also increases the possibility of fraud.

The essential features of a token are: (1) it should be unique, and (2) service providers and other unauthorized entities cannot “reverse engineer” the original identity or PII from the token. There are two primary types of tokenization:

  • Front-end tokenization: “Front-end” tokenization is the creation of a token by the user as part of an online service; the token can later be used in digital transactions in place of the original identifier value. This is the approach taken by Aadhaar to create a Virtual ID derived from India’s Aadhaar number (as described in Box 21). The problem with front-end tokenization is that it is very user driven, requiring users to be digitally literate and technically capable of both understanding why they would need a token and how to create one online. This could easily lead to a digital divide with regard to privacy protection.

  • Back-end tokenization: “Back-end” tokenization is when the identity provider (or token provider) tokenizes identifiers before they are shared with other systems, limiting the propagation of the original identifier and controlling the correlation of data. Back-end tokenization is done automatically by the system without user intervention, meaning that people do not need to do anything manually or understand why they would need to create tokens, eliminating any potential digital divide and protecting identifiers and PII at the source. Austria’s virtual citizen card is one example of this type of tokenization, and India has also implemented back-end tokenization of the Aadhaar number in addition to its Virtual ID.
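Back-end tokenization can be sketched with a keyed one-way hash: the identity provider derives a different, non-reversible token for each relying party. The key, input format, and token length below are assumptions for illustration; real systems such as Aadhaar’s UID token or Austria’s ssPIN define their own parameters.

```python
import hashlib
import hmac

# Sketch of back-end tokenization: the identity provider derives a distinct,
# non-reversible token per relying party with a keyed hash (HMAC). The key
# and token format here are illustrative assumptions, not any real system's spec.
SECRET_KEY = b"identity-provider-secret"   # held only by the identity provider

def backend_token(identifier: str, relying_party: str) -> str:
    msg = f"{identifier}|{relying_party}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()

uid = "1234-5678-9012"
t_tax = backend_token(uid, "tax-authority")
t_health = backend_token(uid, "health-service")
print(t_tax != t_health)                             # True: no cross-database correlation
print(backend_token(uid, "tax-authority") == t_tax)  # True: deterministic per party
```

Because each relying party receives a different token for the same person, records cannot be linked across their databases, yet each party can store its token and recognize the person on every return visit.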

The data contained on Austria’s virtual citizen card (CC) is called the “Identity Link” and consists of the holder’s full name, date of birth, the cryptographic keys required for encryption and digital signatures, and the “SourcePIN”—a unique identifier created by strong encryption of the 12-digit unique ID (CRR) number. To ensure integrity and authenticity, the Identity Link data structure is digitally signed by the SourcePIN Register Authority at issuance. Access to the SourcePIN and cryptographic keys on a CC is protected by a PIN.

To safeguard user privacy, the eGovernment Act stipulates that different identifiers be used for each of the country’s 26 public administration sectors—e.g., tax, health, education, etc.—that a person accesses. A sector-specific personal identifier (ssPIN) is created from the SourcePIN using one-way derivation, a tokenization method through which a sector-specific PIN is algorithmically computed from the SourcePIN. Unlike the SourcePIN, the ssPIN can be stored in administrative procedures. Public authorities can use the same ssPIN to retrieve a citizen’s data stored within the same procedural sector, for example, if they need to view the citizen’s records or use it to pre-fill forms. However, authorities do not have access to ssPINs from other sectors. Administrative procedures often require authorities from different sectors to work together. If authority “A” requires information about a person from authority “B” in another sector, authority “A” can request sector “B’s” identifier from the SourcePIN Register Authority by providing the identifier from its own sector, the person’s first and last name, and their date of birth. The SourcePIN Register Authority then sends authority “B’s” ssPIN to authority “A” in encrypted form; however, this can only be decrypted by authority “B”. To access the data, authority “A” then sends the encrypted ssPIN to authority “B,” which decrypts it and returns the requested data.

Tokenization vs. encryption

Although tokenization and encryption both obscure personal data, they do so in different ways. In general, tokenization is often simpler and cheaper to implement than encryption and has a lower impact on relying parties, as they do not need to decrypt data in order to use it. Tokens also have the advantage that, because they replace PII with a surrogate value rather than hiding it the way encryption does, a breach of the tokenized data alone does not expose the original data.

At the same time, however, tokenization requires a means of mapping tokens to the actual identifier or PII data values (e.g., a token vault or algorithm), with the most obvious options being cryptography or reference tables. This can create issues with scalability, particularly where there is a need to access the actual user data in order to complete a transaction. For authentication this is not always the case, as there does not necessarily need to be disclosure of any personal data in order to prove that individuals are who they say they are. Implementations such as GOV.UK Verify and Aadhaar are capable of managing the tokenization of identifiers at scale by avoiding the need to share data.
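The contrast can be shown in a few lines of toy code: an encrypted value is recoverable by anyone holding the key, while a random token is resolvable only through the mapping the token provider keeps. The XOR “cipher” below is a deliberately weak stand-in for illustration and must never be used for real data.

```python
import secrets

# Toy contrast between encryption (reversible with the key) and tokenization
# (a random surrogate resolvable only via the provider's mapping). The XOR
# "cipher" is an illustrative stand-in, not real encryption.
KEY = secrets.token_bytes(16)

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Applying the same operation twice with the same key restores the input.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

vault: dict[str, str] = {}

def tokenize(value: str) -> str:
    token = secrets.token_hex(8)
    vault[token] = value               # mapping lives only with the token provider
    return token

pan = "4111111111111111"
ciphertext = xor_cipher(pan.encode(), KEY)
print(xor_cipher(ciphertext, KEY).decode())   # 4111111111111111: key holder recovers it
tok = tokenize(pan)
print(vault[tok])                              # 4111111111111111: requires the vault lookup
```

This is the scalability trade-off in the paragraph above: the encrypted path needs only key distribution, while the tokenized path needs a lookup against the provider’s vault whenever the real value is required.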

A key privacy-enhancing aspect is that the Virtual ID is temporary and revocable. As a result, service providers cannot rely on it or use it for correlation across databases. Users can change their Virtual ID as needed, just as one would reset their computer password/PIN.

As a complement to the Virtual ID, UIDAI also introduced back-end tokenization to address the storage of Aadhaar numbers in service provider databases. Now, when a user gives their Aadhaar number or Virtual ID to a service provider for authentication, the system uses a cryptographic hash function to generate a 72-character alphanumeric token, specific to that service provider and Aadhaar number, which can be stored in the service provider database. Because different agencies receive different tokens for the same person, this prevents the linkability of information across databases based on the Aadhaar number. Only UIDAI and the Aadhaar system know the mapping between the Aadhaar number and the tokens provided to the service providers.

Subsequently, when the user authenticates with the service provider, the ID system again computes the token using the same hash function with the Aadhaar number, service provider code, and a secret message as inputs, generating the same UID token. The UID token is always the same for a given combination of Aadhaar number and service provider code. The combination of the Virtual ID and the UID token increases the level of privacy and security. Certain service providers (“global AUAs”) are allowed to store and use Aadhaar numbers and use the full eKYC API, which returns both the Aadhaar number and the token, along with the KYC data. Other service providers (“local AUAs”) can only use the limited eKYC API using the token and do not receive the Aadhaar number. This limits the linkability of personal information across databases.

Impact of Tokenism


Tokenism has a significant negative impact on systems because it:

Crowds out diverse thoughts. Tokenism has significant negative consequences for an organization or society as a whole because it ignores the differences in thought, ideas, opinion, and choice that come from a more diverse group of people—the very differences that lead to more innovation. Instead, it supports the idea of homogeneity and creates a culture of sameness.

Forgets about intersectionality. Tokenism often reduces individuals to one single characteristic of their identity—like their race or ethnicity, gender, or sexual orientation—and ignores everything else. This often overlooks important facets of intersectionality in a person’s identity—for instance, the unique experiences of being a Black woman rather than a Black man, or being a queer person with a disability.

Ignores qualified candidates from marginalized groups. Tokenism’s approach to diversity within a system often relies on a “quota system”—allowing in only a few people from each underrepresented group (“token hires”) while maintaining the predominantly privileged majority. This system disempowers other qualified candidates because once an organization fills its perceived “quota” for diverse demographics, it might ignore marginalized candidates in favor of privileged ones.

Puts undue stress on tokenized individuals. For the marginalized individuals who make it into the organization, there is an unreasonable amount of pressure to both perform at exceptional levels and serve as a representative of their entire marginalized group (for instance, the entire BIPOC community). This can have a major impact on tokenized individuals’ mental health and can lead to exhaustion and burnout.

Reinforces existing power structures. The most systemic problem with tokenism is that it resists real change—by admitting only a handful of marginalized individuals into the existing privilege system, it reduces their ability to make changes to the organization that would make it friendlier and more inclusive to other marginalized individuals. In addition, it perpetuates the falsehood that because this handful of people “made it,” other marginalized individuals will encounter fewer obstacles to being successful, as well.

How to Prevent Tokenism

Here are a few ways to prevent tokenism in workplaces, media, and other systems:

  • Avoid placing unreasonable expectations on marginalized groups. Token employees and individuals experience significant pressure to outperform their privileged peers and represent their entire group. Make sure your systems for evaluation don’t expect higher performance from marginalized individuals than from their privileged peers, and avoid treating individuals as stand-ins for an entire race, culture, gender, or sexual orientation. In addition, where appropriate, avoid more generalized terms (like Asian or Black American) in favor of more specific terms (like Chinese or Nigerian American).

  • Be genuine in your desire for inclusion and change. Tokenism comes from an insincere, perfunctory desire to fulfill an expectation; real inclusion, on the other hand, comes from a genuine desire to change the systems of privilege and listen to more voices. Avoid opting for diversity as a merely symbolic gesture toward social justice.

  • Celebrate diversity year-round. One regular form of tokenism in media is the celebration of short diversity initiatives like Black History Month, in which Black creators receive more features for only a short period before an organization returns to its mostly white programming. Avoid extreme cycles like this by working to center marginalized groups regularly, year-round.

  • Evaluate the system, not just the people within it. Tokenism works to maintain the existing power system by simply admitting marginalized individuals into it; however, these individuals often struggle to find success in these positions because they must follow the rules and expectations of the privileged class while experiencing microaggressions or other hostility. To truly increase diversity and inclusion in an organization, evaluate the existing systems and work to create an open system that enthusiastically listens to different voices.

SUMMARY/CONCLUSION

Tokenism (toh-kuh-nih-zuhm) is a form of prejudice that privileges only a handful of marginalized individuals at the expense of the entire group. Tokenism is a form of covert prejudice and only reinforces existing hierarchical power structures. It affects all marginalized groups—including people of color, LGBTQ+ individuals, disabled individuals, and women—and it exists in all systems, including workplaces and hiring practices, media representation, cultural influencers, education and academia, and politics.
