Understanding the Impediments and Hidden Challenges of Asset Tokenization Projects
Image Source: Blockbr.com

Introduction

Asset tokenization, or tokenization of real-world assets (RWA), describes the process of issuing and processing a digital token that represents an underlying asset such as real estate, gold, and even bonds, equities, and other instruments that are mainstream in financial services. These tokens are issued to streamline transaction processing, with the aim of reducing transaction costs and flattening business processes for greater efficiency. Blockchain becomes the technology that powers the transaction system, while tokenization advances the primary drivers and expanded market opportunities tied to accessibility of the asset class, with all the obvious and well-discussed benefits of transparency and liquidity.

In this post, I aim to temper the enthusiasm around asset tokenization and examine the hidden challenges that may impede the growth and proliferation of asset tokenization projects, which represent a projected multitrillion-dollar opportunity for market participants and for new blockchain technology and service providers. Although a successful proof of concept (PoC) may demonstrate the art of the possible, implementation and deployment involve not only other market participants – such as broker-dealer networks, transfer agencies, fund accounting, and custody, to name a few – but also regulatory posturing around the security and resiliency of the infrastructure and the impact of increased velocity of asset transfer on current securities payment and settlement systems. While technical constructs such as a blockchain network infrastructure, key/wallet management, and custody are similar to the technology approach for crypto assets, the tokenization of existing assets involves a greater degree of operational disruption. I have identified two broad areas that are essential to lowering barriers and reducing the impediments (and costs):

• A harmonized operational framework

• Interoperability on-chain and off-chain

I will discuss the relevance of these two topics as essential considerations and develop a perspective that should aid in the risk modeling and cost-structure analysis of projects. As a data point, the average asset-tokenization project has a lifespan of 1.2 years, a cost of 1.5 million USD, and seven vendors. We can focus on reducing the cost of failures by better understanding the business of getting into asset tokenization.

Understanding the Implications of Asset Tokenization

Digitization is the first step in many enterprise and permissionless blockchain projects. Tokenization is the process of converting an asset and rights, or claim to the asset, into a digital representation, or token, on a blockchain network. At this time, it may be prudent to draw a distinction between a (crypto) asset or currency and a tokenized asset. A (crypto) asset or currency is a medium of exchange or a protocol-driven exchange mechanism that often embodies the same characteristics of a real-world currency, such as durability, limited supply, and recognition by a network, while being backed by a common belief system (like a fiat currency). A (crypto) asset or currency also represents a byproduct of trust systems (consensus) as a vehicle to back the incentive economic model that rewards and fuels the trust system of a network, making it a trust currency of the network. A token, on the other hand, can be many things: a digital representation of a physical good, making it a digital twin, or a Layer-2 protocol that rides on the (crypto) asset or currency and represents a unit of value.

This distinction between a (crypto) asset or currency and a tokenized asset is important for understanding the exchange vehicles, valuation models, and fungibility across the various value networks that are emerging and posing challenges around interoperability. The challenges are not just technical but also business challenges around equitable swaps. Tokenization of assets can lead to the creation of a business model that fuels fractional ownership, or the ability to own an instance of a large asset. The promise of asset tokenization on blockchain-based business networks is not just digitization and a solution to the inefficiencies of time and trust; it is also the creation of new business models and co-creation from synergies among network participants that did not exist before.
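The fractional-ownership model described above can be sketched as a ledger that splits one large asset into equal units tracked per holder. This is an illustrative toy model only – the class, field, and wallet names are my own assumptions, not a real token standard such as ERC-20:

```python
from dataclasses import dataclass, field

@dataclass
class FractionalAsset:
    # Toy model of fractional ownership: one large asset split into
    # equal units tracked per holder. Names and structure are
    # illustrative, not a real token standard.
    asset_id: str
    total_units: int
    holdings: dict = field(default_factory=dict)

    def issue(self, issuer: str) -> None:
        # The issuer initially holds every fractional unit.
        self.holdings[issuer] = self.total_units

    def transfer(self, sender: str, receiver: str, units: int) -> None:
        if self.holdings.get(sender, 0) < units:
            raise ValueError("insufficient units")
        self.holdings[sender] -= units
        self.holdings[receiver] = self.holdings.get(receiver, 0) + units

# A large property split into 10,000 units; an investor acquires 25.
building = FractionalAsset("property-001", total_units=10_000)
building.issue("issuer-wallet")
building.transfer("issuer-wallet", "investor-wallet", 25)
print(building.holdings["investor-wallet"])  # 25
```

The point of the sketch is the business-model shift: ownership of an instance of the asset becomes a ledger entry that can be transferred without moving the underlying asset itself.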

While blockchain itself provides the technology constructs to facilitate exchange, ownership, and trust in the network, it is in the digitization of value elements where asset tokenization is essential. In essence, digitization is a prerequisite to tokenization. In the financial services context, digitization of existing services and token-driven DeFi present two parallel business streams, which will converge as the industry aims to provide a unified user experience. Tokenization implies that account management and claims on assets are driven by cryptographic keys, as opposed to account management and asset management by a system operator called a bank. Tokenization is more than just account management and claims to an asset: it enables divisibility, fungibility, and disintermediated business functions, such as asset transfer. It is a fundamental building block and prerequisite for an “Internet of Value.”
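The key-driven claim model above can be illustrated with a minimal sketch: the ledger honors a transfer only if the presenter proves control of the holder's key, rather than a bank operator looking up an account. Note the loud assumption here – real tokenized-asset systems use asymmetric signatures (e.g. ECDSA over a wallet's private key); the HMAC below is only a self-contained stand-in for the signature check:

```python
import hashlib
import hmac

def sign(holder_key: bytes, message: bytes) -> bytes:
    # Stand-in "signature" for illustration only: production systems
    # use asymmetric schemes (e.g. ECDSA), not HMAC with a shared key.
    return hmac.new(holder_key, message, hashlib.sha256).digest()

def authorize_transfer(holder_key: bytes, message: bytes, signature: bytes) -> bool:
    # The claim on the asset is honored only if the presenter controls
    # the key -- no system operator resolves an account on their behalf.
    if not hmac.compare_digest(sign(holder_key, message), signature):
        raise PermissionError("invalid signature: claim rejected")
    return True

key = b"holder-private-key"
msg = b"transfer asset-7: 10 units to wallet-B"
print(authorize_transfer(key, msg, sign(key, msg)))  # True
```

The design point is that possession of the key *is* the claim, which is what makes disintermediated asset transfer possible.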


Harmonized Operational Framework

There are several operational characteristics specific to the financial services industry that need adequate attention as an enterprise embarks on an asset-tokenization journey. Each asset class represents its own risk and operational framework, and the lifecycle of these assets is managed by employing people, processes, and technology to comply with regulations and ensure prudential treatment of assets. When we tokenize assets, we essentially create a new infrastructure (rails) for every instrument (asset class), which creates a velocity mismatch, as tokenized assets have a different transaction velocity than their dematerialized counterparts. Current systems rely on a series of messaging systems and batched relays that ensure the reconciliation of assets as they move through today's financial market infrastructure. Tokenized assets aim to solve this problem by ensuring transaction finality on a blockchain-powered transaction infrastructure. As financial institutions innovate with asset tokenization, the coexistence of these two infrastructures is a huge consideration for operational management and for the cost structure of maintaining two systems for the same asset class. This coexistence will require the legacy system to act as a harmonized system of record (SoR) to ensure there is a definitive and authoritative book of record. This harmonized SoR implies a massive integration effort between the asset-tokenization infrastructure (blockchain, custody, etc.) and current core banking systems that maintain accounting books of record (ABOR). To overcome this legacy burden, which crypto does not have, asset tokenization will need integration with clearing and settlement systems (transaction settlement), market data systems (asset valuation), and other reporting and regulatory systems (KYC, sanctions checks).
The core technology constructs such as blockchain infrastructure, key/wallet management, and digital asset custody are similar, as the technology stack is borrowed from crypto, which remains unconstrained and enjoys innovation at an unprecedented scale; however, the technology debt and the cost of maintaining a permissioned blockchain network are significant considerations. Asset tokenization is a transient phase that involves relying on existing legacy systems while building a new system and ensuring minimal operational impact on end clients. The goal is to address the operational aspects of transaction processing for digital assets with harmonized operations that can account for, reconcile, and manage the two different types of asset classes – tokenized and dematerialized – with varied modalities. This requires attention to system integration and can be achieved by providing a layer of abstraction with configurable and definitive linkages to authentication, authorization, and accounting (AAA) systems.
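The reconciliation burden of running two books of record can be made concrete with a small sketch: compare on-chain token positions against the legacy ABOR and report breaks. The account identifiers and the break-record fields are illustrative assumptions, not a standard schema:

```python
def reconcile(chain_positions: dict, abor_positions: dict) -> list:
    # Compare on-chain token balances against the legacy accounting
    # book of record (ABOR) and report breaks for operations to
    # investigate. Missing accounts are treated as zero positions.
    breaks = []
    for account in sorted(set(chain_positions) | set(abor_positions)):
        on_chain = chain_positions.get(account, 0)
        in_abor = abor_positions.get(account, 0)
        if on_chain != in_abor:
            breaks.append({"account": account,
                           "on_chain": on_chain,
                           "abor": in_abor,
                           "diff": on_chain - in_abor})
    return breaks

# A trade settled on-chain but not yet posted to the ABOR shows up
# as a break, as does a position the chain has never seen.
chain = {"acct-1": 100, "acct-2": 50}
abor = {"acct-1": 100, "acct-2": 45, "acct-3": 10}
for brk in reconcile(chain, abor):
    print(brk)
```

In practice this abstraction layer sits between the blockchain network and the core banking stack, and its output feeds the AAA and reporting systems mentioned above.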

Image Source: EY


The Single Most Important Design Choice: Interoperability

As the industry gears up for a tokenized future, every experiment and pilot brings us closer to a reality that includes modernization of our aging financial infrastructure, enabling transactions with tokenized assets and reaping the advertised benefits. While these experiments are conducted independently and, in some cases, by industry consortiums, the narrative centers on infrastructure modernization rather than on market structure and the roles of market participants. In a previous article, I discussed the need for a capital market structure in the context of crypto, but I think the inverse is true for tokenized assets, and perhaps we will see the emergence of new token market utilities. Regardless of these evolutionary changes, interoperability is the single most important design choice that an enterprise has to make to ensure smooth and intended outcomes from a pilot.

In the context of distributed ledger technologies (DLTs), interoperability involves enabling the frictionless flow of data, value, and token “business logic” across distinct homogeneous and heterogeneous networks while maintaining their trust and security foundations. Interoperability between blockchain systems is one of the barriers to widespread adoption of blockchain technology and DLT in general. Due to design limitations, asset-class requirements, enterprise preferences on technology, or even regulatory considerations in technology design, efforts thus far have led to an interesting mix of technologies and designs. For various market participants that are either utilities or in the asset-servicing business, there may be no choice but to support the various dominant emerging technology stacks underlying these tokenized assets. And as we move assets, or information about assets, we need to adhere not only to existing norms but also to blockchain principles of trusted transaction processing. An interoperability architecture for DLT networks is thus required to enable the secure movement of digital assets across multiple DLT networks while satisfying transfer atomicity, consistency, and durability criteria. The architecture must recognize that there are various DLT networks and that their internal constructions may be incompatible with one another. It is therefore recommended to include the following design imperatives in asset-tokenization projects:

• Secure messaging about assets and the state of the assets across chains

• Asset transfer across two distinct chains (networks) with a homogeneous technology stack

• Asset transfer across two distinct chains (networks) with a heterogeneous technology stack

These design imperatives can leverage tried and tested technologies from the crypto space, such as oracles, trusted centralized bridges, decentralized bridges, bilateral HTLCs (hash time-locked contracts), decentralized oracle networks (DONs), asset bridges, and so on. These technologies serve the important purpose of ensuring trusted transaction processing with tokenized assets within the fragmented current market infrastructure. It is conceivable that these technologies will shape new, much-needed market utilities as tokenized assets take center stage and force shifts in the roles of current market participants, as by nature the technology that enables asset tokenization is a disruptive force.
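The HTLC mechanism mentioned above can be sketched in a few lines: the counterparty can claim the locked amount by revealing the hash preimage before a deadline; after the deadline, only a refund is possible. Because contracts on both chains share the same hashlock, revealing the secret on one chain lets the other side claim on the other, which is what makes the swap atomic. This is a single-process toy – a real HTLC is a smart contract or script enforced independently by each chain:

```python
import hashlib
import time

class HTLC:
    # Toy hash time-locked contract. Simplified single-process sketch;
    # a real HTLC is enforced on-chain by each network's runtime.
    def __init__(self, hashlock: bytes, timelock: float, amount: int):
        self.hashlock = hashlock   # sha256(secret), published up front
        self.timelock = timelock   # unix-time deadline for the claim
        self.amount = amount
        self.settled = False

    def claim(self, preimage: bytes, now: float) -> int:
        # Claim succeeds only before the deadline, with the preimage.
        if self.settled or now >= self.timelock:
            raise RuntimeError("claim window closed")
        if hashlib.sha256(preimage).digest() != self.hashlock:
            raise ValueError("wrong preimage")
        self.settled = True
        return self.amount

    def refund(self, now: float) -> int:
        # Refund to the original party once the deadline has passed.
        if self.settled or now < self.timelock:
            raise RuntimeError("refund not yet available")
        self.settled = True
        return self.amount

# Atomic-swap sketch: one secret, two chains, shared hashlock.
secret = b"swap-secret"
lock = hashlib.sha256(secret).digest()
contract = HTLC(lock, timelock=time.time() + 3600, amount=100)
print(contract.claim(secret, now=time.time()))  # 100
```

The timelock is the safety valve: if the counterparty never reveals the secret, the locked asset is recoverable rather than stranded mid-transfer, which is how the atomicity criterion discussed earlier is satisfied.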


Conclusion

Asset tokenization, or the tokenization of real-world assets (RWA), is often used to describe a process of issuing and processing a digital token that represents an underlying asset. I have identified two broad areas that are essential to lowering barriers and reducing impediments (and costs):

• A harmonized operational framework

• Interoperability on-chain and off-chain

When we tokenize assets, we essentially create a new infrastructure (rails) for every instrument (asset class), which creates a velocity mismatch, as tokenized assets have a different transaction velocity than their dematerialized counterparts. Asset tokenization is a transient phase that involves relying on existing legacy systems while building a new system and ensuring minimal operational impact on end clients. The goal is to address the operational aspects of transaction processing for digital assets with harmonized operations that can account for, reconcile, and manage the two different types of asset classes – tokenized and dematerialized – with varied modalities. I have identified interoperability as the single most important design choice an enterprise has to make to ensure smooth and intended outcomes from a pilot. As we move assets or information about assets, we need to adhere not only to existing norms but also to blockchain principles of trusted transaction processing. An interoperability architecture for DLT networks is thus required to enable the secure movement of digital assets across multiple DLT networks while satisfying transfer atomicity, consistency, and durability criteria.

These technologies serve the important purpose of ensuring trusted transaction processing with tokenized assets within the fragmented current market infrastructure. It is conceivable that these technologies will shape new, much-needed market utilities as tokenized assets take center stage and force shifts in the roles of current market participants, as by nature the technology that enables asset tokenization is a disruptive force.
